The twice-yearly CMO Survey was recently published and, as always, the report reveals many things that people are thinking but few discuss openly.
In addition to reporting on industry outlooks and budgets, the CMO Survey asks a biting question: “Which best describes how you show the impact of social media on your business?”
Instead of offering a wide variety of answers, the survey forces respondents to be clear about their ability to ‘prove’ the impact of their actions:
- Unable to show the impact yet
- Prove the impact quantitatively
- Good qualitative sense of the impact, but not a quantitative impact
And in survey after survey, those unable to show the impact of social media outnumber those who can by at least three to one.
Admittedly, there have been improvements in 2018. Those able to show the impact have jumped from a low of 16% in 2017 to an all-time high of 25% in 2018. But this improvement hides the fact that three out of four CMOs cannot yet demonstrate, quantitatively, the impact of social media using real business metrics.
And perhaps this wouldn’t be a problem, except that spending on social media has risen from less than 5% of the marketing budget in 2009 to nearly 14% in 2018.
Marketing managers are, therefore, allocating an ever-greater share of their company’s resources to a channel with uncertain business impact.
At the same time, and perhaps as a consequence, nearly two in three marketing leaders report feeling pressure from the CEO or board to ‘prove’ the value of marketing. Enough is enough, the bean counters appear to be saying.
Why marketers struggle to prove ROI
While there are undoubtedly many factors which have led businesses to ask CMOs to prove ROI, another chart sheds some light on the current situation.
In the same survey, CMOs were also asked: ‘What factors prevent your company from using marketing analytics more often in decision making?’
Tellingly, the responses all show growing anxiety among CMOs about using analytics for decision making over the past 18 months. More than a quarter indicate that analytics ‘does not offer sufficient insight’, nearly half say there is a ‘lack of people who can link [analytics] to marketing practice’, and more than half say they have a ‘lack of process/tools to measure success through analytics’.
It seems, therefore, that senior marketers are becoming increasingly aware of their inability to leverage analytics for decision making.
It isn’t a huge leap in logic to connect these two findings. Marketers are being asked, as much as or more than ever, to prove ROI, and they are finding that they don’t have the tools to do so.
So, what can marketers do?
It is perhaps worthwhile to distinguish two approaches marketers in this situation could take.
One approach is to actually prove ROI. To do so, marketers could run a relatively simple experiment: identify two similar markets, keep one as a control in which marketing continues as usual, and give the other, the treatment market, no marketing at all. Over time, the experiment would show whether the ‘no marketing’ market underperformed.
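What such a test would involve can be sketched quickly. The markets, figures and difference-in-differences comparison below are invented purely for illustration (none of it comes from the survey); the point is only that ‘proving ROI experimentally’ reduces to comparing how the two markets move once marketing is switched off in one of them.

```python
# Illustrative sketch only: hypothetical monthly revenue indices for two
# comparable markets, before and during the test period.
pre_control  = [100, 102, 101]   # control market, before the test
pre_treat    = [ 98,  99, 100]   # treatment market, before the test
post_control = [104, 106, 105]   # control: marketing as usual
post_treat   = [ 97,  94,  92]   # treatment: marketing switched off

def mean(xs):
    return sum(xs) / len(xs)

# Difference-in-differences: compare each market's change against its own
# pre-test baseline, so any pre-existing gap between the markets cancels out.
control_change = mean(post_control) - mean(pre_control)
treatment_change = mean(post_treat) - mean(pre_treat)

# A positive number means the market without marketing fell behind.
marketing_effect = control_change - treatment_change
print(f"Estimated impact of marketing: {marketing_effect:.1f} index points")
```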
As most businesses run on thin margins, however, and as executives, rightfully, worry about losing market share, it is unlikely that marketers will ever be permitted to ‘turn off marketing’ for long enough (say, 1 year) to prove its effectiveness.
Another approach
Another way that marketers could ‘prove’ ROI, though, is to demonstrate that they are careful stewards of their company’s resources. That is, while it may be impractical to prove ROI experimentally, marketers can show that they are thinking carefully about a marketing strategy designed specifically to improve business results.
This takes us back to analytics. To ‘prove’ ROI in this way, marketers need to divide their analytics programme into three distinct tasks:
1. Reporting: stating what has happened (KPI reports, etc.).
2. Exploratory: looking for patterns in data to find new opportunities.
3. Diagnostic: analyzing data to determine why something happened.
While each of these is an essential marketing activity, they are not equally effective at proving that marketers are focused on delivering value to the business.
As for 1), marketing reports offer useful indicators of marketing performance, but they are typically full of jargon and incomprehensible to executives.
Regarding 2), exploratory analytics is indeed valuable to the business, but it is difficult for non-marketers to appreciate the results. Few people outside of marketing care about the details of market segments or channel performance.
To prove ROI, then, marketers should spend more time working on 3). Diagnostic reports can show that marketers are actively investigating why their efforts are or are not working and how they could be more effective at improving key business metrics.
How diagnostic analytics can help to prove ROI
For marketing analytics to help marketers prove their value, though, marketers need to conduct an honest assessment of their activities. Specifically, before they start a campaign they need to state, in concrete figures, what impact they expect it to have. In other words, they need a hypothesis.
A hypothesis is simply a statement which is written in such a way that data will either confirm or disprove it: “We believe this campaign will raise brand awareness in our target market by 10% in Q1, 2019”. “This campaign intends to raise brand awareness” won’t cut it.
So, marketers need to frame what they do in the style of SMART objectives. That is, hypotheses should be specific, measurable, actionable, relevant and time-bound. A stronger version of the hypothesis ties it to the activity: “We are doing X because we believe it will increase brand recall among the target market by 10% over the previous quarter”.
With such a hypothesis, marketers can review the data at the end of a campaign and assess whether they were correct. If not, they can examine supporting metrics that may have contributed to the result: perhaps the campaign didn’t achieve the reach they expected, or the item being measured, say recall of the logo, was not prominent enough in the advertising.
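As a rough illustration, and with entirely invented survey numbers, an end-of-campaign check on the brand-recall hypothesis above might look something like this: compare last quarter’s recall rate with the post-campaign rate, then ask both whether the lift reached the 10% target and whether it is larger than ordinary survey noise.

```python
# Hypothetical figures only: the sample sizes and recall counts are made up,
# and the 10% target is the relative lift stated in the hypothesis above.
from math import sqrt, erf

prev_recalled, prev_n = 240, 1000    # last quarter's survey wave
curr_recalled, curr_n = 281, 1000    # post-campaign survey wave

prev_rate = prev_recalled / prev_n
curr_rate = curr_recalled / curr_n

observed_lift = (curr_rate - prev_rate) / prev_rate   # relative change
target_lift = 0.10                                    # "+10% over the previous quarter"

# Two-proportion z-test: is the change bigger than sampling noise?
pooled = (prev_recalled + curr_recalled) / (prev_n + curr_n)
se = sqrt(pooled * (1 - pooled) * (1 / prev_n + 1 / curr_n))
z = (curr_rate - prev_rate) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal approximation

print(f"Observed lift: {observed_lift:.1%} (target: {target_lift:.0%})")
print(f"z = {z:.2f}, p = {p_value:.3f}")
if observed_lift >= target_lift and p_value < 0.05:
    print("Hypothesis supported")
else:
    print("Hypothesis not supported; examine reach, creative prominence, etc.")
```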
Finally, with careful analysis, they can diagnose problems, make corrections and write down a new hypothesis for the next campaign. Such an approach should improve results with each iteration.
Marketers who take a methodical approach to decision-making demonstrate that investments in marketing are being taken seriously. Then, when asked about their strategy, they can respond confidently with their initial plan, the performance data that followed, and an account of how they evolved the strategy in line with the results.
Will this be enough?
Well-presented and meaningful analytics may or may not be enough to keep the critics at bay, but aside from running the undesirable control experiment described above, diagnostic marketing reports are a good start toward defending the marketing budget.
Besides, what is the alternative? Telling senior management that marketers decide how to spend company profits on the basis of past experience, intuition and industry ‘best practices’?
Perhaps that approach has worked in the past, but if personal biases are the only rationale behind marketing decisions, then marketers shouldn’t be surprised when they are asked to prove their value to the business, nor by how difficult it is to make their case.