One of the great benefits of the new digital marketing world is the opportunity to monitor and respond to actual prospect behavior. No longer are we hamstrung by clumsy tactics that keep us from fine-tuning our approach.
No longer are we tossing all our ideas into focus groups where people tell us what they think we want to hear. We now can adjust our marketing nurturing process to what prospects actually do, not just what they say.
There is much that is efficient, intelligent, and creative about using A/B testing to optimize the expression of relevant ideas. This is a beautiful and happening thing.
But there is a growing tendency to misapply it, which can lead us to marketing oblivion (or as I like to call it, “ablivion”).
There is a brave new world in marketing and marketing strategy. And the thinking goes that it is smarter and faster to skip understanding your prospect personas and which messages are relevant to them, and simply set up a series of A/B tests to automatically optimize your way to nirvana. Kind of like an army of monkeys eventually typing out the collected works of Shakespeare.
In fact, a CMO recently told me that before he arrived at his new company, the marketing strategy was to simply implement a series of A/B tests to slowly optimize their responses to their marketing communications.
While I believe that all learning should be agile (like A/B testing), I want to point out two overlooked consequences of this overreliance on A/B testing: misinterpreting behavior and turning away prospects.
What Sudoku Taught Me about A/B Testing
I never waste time playing games on my iPhone.
OK, hardly ever. Like maybe I’ll reward myself with a mindless game after I finish this post. Anyway, I do it as “research.”
You see, I am fascinated by the variety of ads someone paid to show me after I finish a game of Sudoku or whatever. In particular, I am fascinated by a new series of ads that cannot be skipped for several seconds. It’s annoying. Not only because it’s a longer interruption than usual, but also because the ads are irrelevant to me.
And here’s my point: I bet the agency that created the ads is telling its client that it has increased ad viewing time (“engagement”) by 50%! The behavioral metrics (viewing time) look fantastic!
But here is what that metric is missing: while the ad is running longer on my screen, I’m not looking at it. I hate it. While this “B” campaign looks like it’s outperforming the “A” campaign, it’s actually having no positive impact and is starting to have a negative one.
This is a simplistic example, but you get my point. A behavioral measure (like viewing time) is never more than an indicator of what we think is going on (brand engagement).
Behavioral measures have face validity, but if we don’t check in with the humans on the other side, we can be quite misled. Digital intelligence without human intelligence is no prospect intelligence at all. Game over.
Strategic Spaghetti on the Wall
The other cost of “extreme” A/B testing without some “prospect intelligence” up front is paid by your prospects … and then by you.
Think about what A/B testing means in terms of prospect experience (I call it “PX,” the marketing equivalent of “CX,” or customer experience). That messaging “spaghetti” you are throwing against the wall is actually being “thrown” at your prospects, who are looking for a solution they think you could provide.
But you are tossing out a lot of “sub-optimal” messaging as you A/B test your way to relevance… which is teaching them that your brand is, in fact, not relevant to them.
This can be very costly, not just in the present but even more in the future. By not starting with some persona-based prospect intelligence, your pre-optimized messaging may be conditioning your limited pool of prospects (perhaps as few as 10,000 in some B2B industries) to dismiss you as a brand … forever!
Marketing Strategy Is Dead. Long Live Strategy.
So, what to do? As in most things in life, balance is beautiful. In our experience, the most efficient path to prospecting perfection combines both sources of prospect intelligence.
It starts with personas that provide the fundamental understanding of what prospects are seeking, thinking, and feeling. This learning does not rely on focus groups or massive surveys, but rather on in-depth conversations about how and why your prospects choose you … or not.
Those insights are the basis for creating messages that A/B testing can then efficiently optimize. Without wasting time or budget. Or spaghetti.