A project strategy that is set in stone is destined to crumble.
You need fluidity and flexibility because it’s inevitable that you will work on a strategy that doesn’t live up to expectations. And when you find yourself in that situation, you need to be able to pivot before you get too far down the wrong path and spend way too much money.
This applies to any aspect of marketing but I want to focus on SEO as an example.
In the good old days (even though I wasn’t around for them), Google’s algorithm could be gamed because we understood that links and keywords were the answer.
But now it’s not that clear anymore. Google is getting smarter, especially as machine learning (you can substitute RankBrain here) plays a larger role. Domain Authority doesn’t always stack up when trying to understand why a competitor is ranking above you. Having keyword-rich meta data isn’t a guarantee either.
If Google’s mission is to solve people’s problems, then they need to surface the best answer, even if that means the title is unoptimized or the domain doesn’t have as many links. Whereas before we could rely on best practices (such as getting the most links), now we need to continually prove what will work in our vertical and for our domain.
We can continue to use best practices as a springboard for what we should be working towards. But we need to take it upon ourselves to not follow those “rules” blindly and to evaluate them every time we start on a new project. It’s not “one size fits all.”
With one change to your process, you will be more confident in your strategies AND you’ll be in a better position to actually reach your goals.
How? It’s simple: Lead with testing.
You should always test your strategies & tactics before executing in full.
There are a few things that you’ll notice immediately if you start incorporating testing into everything that you do.
- Stop worrying about being wrong. When you lead with testing, you aren’t placing your head on the chopping block. When (notice I didn’t say if) something doesn’t pan out, it’s not a complete failure; it’s a chance for you to learn and turn it into something else that will work.
- Avoid the waste. You can’t possibly know what’s going to work in every situation because each one is unique. By testing first, you find what works for a specific scenario, learning from the successes and tossing out what just doesn’t cut it.
- See how your work ties back to project goals. If you start testing a strategy that you thought would contribute 50% toward your goal, but then realize it will be lucky to drive 25%, that’s a great opportunity to rethink the goals or rethink how you reach them. The advantage of testing is that you can make these calls much earlier in the project. And by not wasting resources on the stuff that doesn’t work, you can reallocate them to what actually needs to be done to accomplish the goals.
Let’s walk through a few examples of recent tests that have been instrumental to the projects in which they were conducted.
Example #1: Quick Redirect Test Resulted in a 600% YoY Increase in Traffic
Our main focus on this account is optimizing existing content and creating new content, as this will lead to improved organic visibility. The only problem is that implementing these types of recommendations requires a lot of time on the client side.
We’ve broken down those recommendations into small batches to help their team implement them, but even those smaller batches were difficult to push through. That’s when we decided to test something else in the meantime that might aid our future efforts.
While digging through GA, we found several old URLs that appeared to have driven significant traffic in the past but were now 404ing or improperly redirecting. We asked our client if redirects would be quicker to implement; they said yes (booyah), so we sent over the list we had put together with little effort.
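If you want to run a similar check on your own site, here’s a minimal sketch of that audit step. It assumes you’ve exported the old URLs from GA into a CSV with a url column (the file name and column are made up for illustration); it flags anything that 404s or answers with a temporary redirect.

```python
# Minimal sketch of the URL audit described above. Assumes a GA export
# saved as old_urls.csv with a "url" column (both names are hypothetical).
import csv
import requests

with open("old_urls.csv", newline="") as f:
    urls = [row["url"] for row in csv.DictReader(f)]

for url in urls:
    # allow_redirects=False so we see the first response, not the final one
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code == 404:
        print(f"404 (candidate for a 301): {url}")
    elif resp.status_code in (302, 307):
        # Temporary redirects don't consolidate link value the way a 301 does
        print(f"Temporary redirect (should likely be a 301): {url} -> {resp.headers.get('Location')}")
```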
A few weeks later, we checked the impact on organic visibility and didn’t really see anything striking across all categories. But it turns out there was a diamond in the rough.
One page in particular saw a significant lift in traffic and conversions right after we implemented the 301 redirects. We are now ranking in position #1 for the page’s head term (a result we weren’t even aiming for) after historically ranking in positions #3–#5.
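For anyone who hasn’t implemented one, a 301 redirect is usually a one-line server rule. Here’s what that can look like in an Apache .htaccess file; the paths below are invented purely for illustration.

```apache
# Hypothetical examples of 301 redirects (Apache, mod_alias).
# All paths are made up for illustration.
Redirect 301 /old-blog/widget-buying-guide /resources/widget-buying-guide
Redirect 301 /2014/widget-tips /resources/widget-tips
```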
What we learned from this test:
- The head term this page ranks for sits in a less competitive SERP, which allowed us to take it over quickly and easily. Now we know this is a content gap that we can continue to take advantage of.
- We further confirmed that meta data optimizations + improved content will probably be the best way to improve organic visibility across the entire domain, since passing link value to other pages didn’t work as well.
Example #2: One Meta Description Change Leads to a 57% Increase in CTR & a 25% Increase in Conversions
This example comes from Marianna Morris + Kati Polaski, two of my colleagues at Seer who continually throw down amazing examples of how to crush marketing, in general. #SeerPride
On one of their accounts, the client believed their audience came from a particular career background before transitioning out of that vertical and into the client’s industry. During the audience interviews they conducted, however, they found that none – absolutely zilch – of their interviewees had that background.
What they did next is smart.
Rather than scramble to pivot their strategy based on the insight gained from the interviews, they proposed validating their findings with a small batch test. Basically, as small as you can get: changing one meta description.
The test was to include language that would better target their “new” audience and see what happened. Turns out this led to a 57% increase in CTR and a 25% increase in conversions post-implementation.
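For anyone who hasn’t edited one, a meta description is a single tag in the page’s `<head>`, which is exactly what makes it such a cheap test. A hypothetical before/after in the spirit of this one (all copy invented):

```html
<!-- Before: copy written for the audience the client assumed they had (invented) -->
<meta name="description" content="Advanced widget solutions built for industry veterans.">

<!-- After: copy targeting the audience the interviews actually surfaced (invented) -->
<meta name="description" content="New to widgets? See how people from any background get started, no prior experience required.">
```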
Now, this test isn’t conclusive in itself. But what it did is crucial: it opened the door for more testing and validation of the information they gathered in their interviews.
Example #3: Testing a Lesson from a Conference Led to a 170% Increase in Organic Traffic
I like this example from Alisa Scharf not only because it’s a small batch test in action, but also because she executed something she learned at a conference (how many times have we done nothing with what we learned!).
At MozCon in 2016, Wayfair presented on internal linking practices. The session aligned with a client situation of ours: the site had a large number of internal links pointing to various sections, creating dozens of internal links on pages where they weren’t always relevant.
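If you want to spot pages like that on your own site, a quick count of anchors per page is enough to get started. Here’s a minimal sketch using requests and BeautifulSoup; the domain and page URLs are placeholders.

```python
# Minimal sketch: count internal links per page to find over-linked templates.
# The domain and page URLs below are placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

DOMAIN = "www.example.com"
pages = [
    f"https://{DOMAIN}/category/widgets",
    f"https://{DOMAIN}/category/gadgets",
]

for page in pages:
    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Relative hrefs and same-host hrefs both count as internal links
    internal = [
        a["href"]
        for a in soup.find_all("a", href=True)
        if urlparse(a["href"]).netloc in ("", DOMAIN)
    ]
    print(f"{page}: {len(internal)} internal links")
```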
The client also happened to be at MozCon, and the idea immediately struck a chord with both parties. So they decided to put it to the test by removing internal links from ~40 pages. The hypothesis was simple: let’s stop diluting the authority of our deep category pages and see if we get results similar to what Wayfair achieved.
A few weeks later, they looked at the data and saw this: a 170% increase in organic traffic. For context, this segment of pages only represents about 1,600 additional monthly sessions. But it’s a test that can be replicated across hundreds of pages!
If you’re thinking “Aw yeah, time to crank up the testing!”, don’t forget that it needs to be baked into your process. Some lessons learned along the way:
- Small batch testing is key. I can’t emphasize this enough. Break the entire project into easily executable chunks that allow you to be nimble and get data to work with. Even if your sample size is too small during the first test, keep iterating until you’re confident you should go full steam ahead (see the sketch after this list for a quick way to gut-check a result).
- Be committed to the post-test analysis. You must have the capabilities and the time to accurately measure what you just tested in order for it to be worthwhile.
- Only test what you can act on. As you ramp up with more and more testing, be careful not to lose sight of the end goal (or goals). Make sure you have a handle on your existing tests before moving on to the next one!
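And on that first point: one lightweight way to gut-check whether a lift like the 57% CTR bump above is real or just noise is a two-proportion z-test on before/after clicks and impressions. The numbers below are invented; this is a sketch, not a substitute for a proper testing framework.

```python
# Sketch: two-proportion z-test to check whether a CTR change is
# statistically meaningful. All numbers are invented for illustration.
from math import sqrt, erf

def ctr_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled proportion under the null hypothesis (no real change)
    p = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p * (1 - p) * (1 / impressions_a + 1 / impressions_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical before/after Search Console data for one page:
p_a, p_b, z, p_value = ctr_z_test(clicks_a=120, impressions_a=4000,
                                  clicks_b=185, impressions_b=3900)
print(f"CTR {p_a:.1%} -> {p_b:.1%}, z = {z:.2f}, p = {p_value:.4f}")
```

If the p-value is small (commonly below 0.05), the change probably isn’t noise; if it isn’t, keep iterating until the sample is big enough to tell you something.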
Big shout out to Ethan Lyon, our Innovation Lead at Seer, who originally taught me how important testing is on a project. And now I ask you: Got any clever examples of a test that led to success or helped you avoid a roadblock later?