Are You Using the Right Content Marketing Metrics?
21 September 2017 | 10:18 am
Years ago, I took my three-year-old to her second dentist appointment. I wasn’t expecting any problems because she had handled her first appointment like a champ, and I had assumed the first one would be scarier than the second. And the second appointment went swimmingly–in fact, she seemed uncommonly cheerful when I told her where we were going. Then, when we got home, she asked, “When do we go to the party?”
She hadn’t been invited to any party, so I had no idea why she was asking. After some back and forth and a head-scratching conversation with her mom, we realized that she had indeed attended a friend’s birthday party right after her first dental appointment. She had merged those two events into one firm (and happy) memory, and now expected the same sequel after seeing the tooth doctor again.
We were able to explain to her that there was no party for her today, and she understood, but it caused me to recognize something all of us human beings do–and not just when we are three years old. We tend to impute meaning to coincidences. This is deadly when making data-driven marketing decisions.
I heard a story–I don’t know if it’s true–that back in the summer of 2012, the Sprint social media team was happy when their positive mentions started increasing dramatically. At least at first. A little digging showed them that the mentions were about the Olympics, and that the happy conversations around the word “sprint” in that context were not something they should take personally.
Another time, I showed a client a set of results and explained that we had tested the system and found it to be 90% accurate. The client took a quick look at the first 10 results on the screen and insisted, “That can’t be true–look, the first one is wrong!” The other nine were correct, which is exactly what 90% accuracy means, but he distrusted the system anyway.
These examples probably seem silly to you–because they are mistakes you didn’t make. But I see clients performing unnatural acts with numbers all the time just because no one is really thinking about what they mean.
One former client told me that they use their web analytics to see the conversions related to every piece of content in their system so they know what the best content is. Unfailingly, the “best” content was for their best-selling products. Maybe you think that products are best sellers due to marketing content alone, but I have my doubts.
Instead of using simple correlations of which pages lead to conversions, perhaps they need to dig deeper, as the Sprint team did, to really understand their numbers. If you are ready to dig deeper–to think in a new way–you can use AI analysis to remove a lot of spurious correlations to get to the underlying causes of what is going on. Once you do that, you can really work on improving the right things.
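To make the confound concrete, here is a minimal sketch–using made-up numbers, not any client’s data–of how a hidden driver like product popularity can produce an impressive but spurious correlation between content and conversions. Once you control for the confounder (here, by regressing it out of both variables), the apparent relationship largely disappears:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical confounder: underlying product popularity.
popularity = rng.normal(size=n)

# Content page views and conversions are BOTH driven by popularity,
# not by each other -- the classic setup for a spurious correlation.
content_views = 2.0 * popularity + rng.normal(size=n)
conversions = 3.0 * popularity + rng.normal(size=n)

# The raw correlation looks impressive...
raw = np.corrcoef(content_views, conversions)[0, 1]

def residuals(y, x):
    """Remove the linear effect of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# ...but after regressing popularity out of both variables,
# the partial correlation is close to zero.
partial = np.corrcoef(residuals(content_views, popularity),
                      residuals(conversions, popularity))[0, 1]

print(f"raw correlation:     {raw:.2f}")      # high
print(f"partial correlation: {partial:.2f}")  # near zero
```

The variable names and coefficients are invented for illustration; the point is only that “page X correlates with conversions” says nothing about whether page X caused them.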
But if you keep thinking the same old way, someone might have to tell you that there is no party for you today.