RSAs vs. ETAs: Stop Comparing CTR


Testing new platforms, bid strategies, ad types, and the like can be exciting, but also daunting. Exciting, because you may have found a new way to improve performance. Daunting, because you can't guarantee it will work, and you might be unsure how to quantify the results. But isn't anxiety part of any test? It doesn't have to be.

Recently, I was looking for ways to improve ad quality for a client. I decided to test out Responsive Search Ads (RSAs) to see how they would perform compared to expanded text ads (ETAs) and text ads (yes, text ads are still producing conversions at a higher CVR and lower CPA for this account!). I will walk you through how to accurately analyze RSA results, as well as share the data from my own tests. 

If you need a refresher on RSAs, skim this article.

Theory, Recommendations & Analysis Process

We will not be comparing CTR between ETAs and RSAs. Why? For one, RSAs will show for different queries, devices, etc., so it's not an apples-to-apples comparison. And two, per a current Google agency rep, Google created RSAs to be complementary to ETAs: they will win some auctions ETAs wouldn't have, and vice versa.

Think of it like owning a donut shop. People flock to your shop because they love your donuts, but you are looking for ways to increase your revenue so you start selling coffee. After one month of selling coffee, you calculate your revenues and find that coffee made up 20% of your sales. Are you disappointed because your sales didn’t double? No, of course not! You were able to capture sales that you had been losing out on. That was the whole point of selling coffee. It would be silly to stop selling coffee because you compared it to your donut sales.

I know comparing donuts and coffee to RSAs and ETAs is a silly and simplified analogy, but I hope you get the point: RSAs are meant to capture traffic your ETAs might be missing out on. 

Recommendations:

  • You should have at least 50 impressions per day per ad group.
  • Best results will generally occur in non-brand campaigns.
  • Avoid testing during a period of seasonality or during other major campaign changes (bid strategy tests, new audiences/geos, etc.).

Incremental Analysis: 3 Easy Steps

Evaluate RSA performance based on incremental clicks and conversions rather than click-through rate. Follow these three steps:

  1. Create 1 RSA per ad group in the campaign(s) you want to test.
  2. Run the test for 1 month. (Even if you have a high-traffic campaign, Google's algorithm takes 1-2 weeks to determine the best combination for an RSA.)
  3. Calculate incremental clicks and conversions by subtracting the 3-month monthly average (from before the RSA launch date) from the 1 month of post-launch data; see the sketch after this list. This analysis will help determine whether there was an overall lift or detraction in performance.
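
To make step 3 concrete, here is a minimal sketch of the math in Python. All numbers below are hypothetical placeholders, not data from the case studies; swap in your own monthly totals exported from Google Ads.

```python
# Minimal sketch of step 3: incremental clicks and conversions.
# All figures are hypothetical examples, not data from this article.

# Monthly averages from the 3 months *before* the RSA launch
baseline = {"clicks": 1200, "conversions": 60}

# Totals for the 1 month *after* the RSA launch (ETAs + RSA combined)
post_launch = {"clicks": 1350, "conversions": 71}

for metric in ("clicks", "conversions"):
    incremental = post_launch[metric] - baseline[metric]
    pct_lift = incremental / baseline[metric] * 100
    print(f"{metric}: {incremental:+d} incremental ({pct_lift:+.1f}% lift)")
```

A positive number means the campaign gained clicks or conversions overall after the RSA launch; a negative number suggests the RSA may be cannibalizing rather than complementing your ETAs.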

You can also set up an experiment in Google Ads if you prefer. Simply create an experiment and add 1 RSA per ad group to the campaign you want to test. Don't pause ETAs in your experiment, and make sure the RSA is added only to the experiment, not to the original campaign.

Results: Two Case Studies

Case A

Following the above process, I tested RSAs in 3 non-brand campaigns for the client I mentioned at the beginning.

The results were less than stellar. None of the campaigns saw incremental clicks, and only one (Campaign 3) saw incremental conversions. We immediately paused RSAs for Campaign 1 because we didn't see incremental gains and CPA increased (an important KPI for this client). However, we continued testing RSAs in Campaigns 2 & 3 to collect more data before making a final decision.

So why am I sharing an example that doesn’t make me look like a PPC genius? The point is that the client wasn’t disappointed with the results because I explained why we were testing RSAs and how we would analyze the results. If I had not explained incremental gains, this test might not have made sense to the client.

Case B

Earlier this spring, we worked with a client that required us to build campaigns from scratch. Initially, we were only running ETAs, but after the campaigns had run steadily for several months, we launched RSAs to see if we could improve search performance. Here's a snapshot of the incremental data:

Compared to Case A (though I'm not suggesting you compare results between different accounts), we saw incremental conversions across all three campaigns. Notice that Campaigns 1 & 2 delivered incremental conversions despite spending nearly 50% less*. Although ETAs continue to see better conversion rates (roughly 2% higher), we keep running RSAs because they deliver incremental conversions. Cutting our CPA by over 50% has also allowed us to invest more budget in our top-performing campaigns.

*Disclaimer: This contradicts my recommendation against testing during major campaign changes. If you spend at least $10k per month, you don't want to analyze ad performance when your budget fluctuates more than 10-15%; a quick check is sketched below. For this client, launching RSAs wasn't a major "test," so we didn't factor the RSAs into our budget optimizations.
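
For readers who want that guardrail as a formula, the sketch below flags any month whose spend deviates too far from the pre-test baseline. The spend figures and the exact 15% threshold are illustrative assumptions, not numbers from this account.

```python
# Guardrail from the disclaimer above: skip ad-performance analysis for any
# month whose spend swings more than ~10-15% vs. the pre-test baseline.
# All figures are hypothetical examples.

BASELINE_MONTHLY_SPEND = 12_000.0   # avg monthly spend before the test
MAX_FLUCTUATION = 0.15              # 15% tolerance from the article

monthly_spend = {"June": 12_400.0, "July": 15_100.0, "August": 11_900.0}

for month, spend in monthly_spend.items():
    fluctuation = abs(spend - BASELINE_MONTHLY_SPEND) / BASELINE_MONTHLY_SPEND
    verdict = "OK to analyze" if fluctuation <= MAX_FLUCTUATION else "skip (budget changed too much)"
    print(f"{month}: {fluctuation:.1%} fluctuation -> {verdict}")
```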

Conclusion: Anxiety-Free Testing = "Why" + "How"

Reduce your anxiety and increase your stakeholders' confidence by laying out a clear plan for how you will analyze the results. It's important that everyone involved understands not only why you are testing but also how you will reach a conclusion once the test is over. We often gloss over the "how" because we are so focused on the "why": we want the buy-in so we can get the darn test running. Don't make this mistake!

If you are looking for more ad testing resources, check out Life After A/B Ad Testing.

Source: Google Responsive Search Ads Performance Impact Guide