Comparing A/B and Multivariate Testing


How often do we try to decide the best color for an add-to-cart button, which subject line to use in an email, or which products to feature on the home page? For each of those choices, there is no reason to guess. An A/B test can determine the most effective option.

A/B tests are controlled experiments that compare two versions of a single element to measure which performs better with users. You can apply A/B testing to just about anything you can measure. Here are some common uses in ecommerce.

  • Email marketing. Test subject lines, body copy, layouts, personalization styles, images, closing texts, and headlines.
  • Promotions. Test marketing text, such as “50% off” versus “$20 off.”
  • Website. Test colors, banners, product features, and call-to-action button colors and text.
  • Advertising. Test ad size, copy, colors, images, and call-to-action text.
  • Social media. Test post length, images, use of emojis, styles, and even hashtags.

A/B tests use a control group and test group. Each test changes only one element. The test measures whatever factor you choose, such as the number of clicks, conversions, opens, or shares.

Say you are considering two promotions for your home page: “Free shipping on all orders over $50” and “10% off if you spend more than $50.” You want to know which one produces more conversions. To conduct the test, you would direct 50 percent of your home page traffic to one offer and 50 percent to the other. You run the test for, say, one week and then tally the results. “Free shipping” has a conversion rate of 2 percent while “10% off” has a conversion rate of 0.5 percent. Therefore, you go with the “Free shipping” promotion.
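
Before declaring a winner, it helps to check whether the gap between the two rates is statistically meaningful. Here is a minimal sketch of a two-proportion z-test; the 5,000 sessions per offer and the 100 and 25 conversion counts are hypothetical numbers chosen to match the 2 percent and 0.5 percent rates above.

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for comparing conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical week of traffic: 5,000 sessions per offer
z, p = two_proportion_z(100, 5000, 25, 5000)  # 2% vs. 0.5%
print(f"z = {z:.2f}, p = {p:.4f}")
```

A z-score above 1.96 (p below 0.05) is the conventional threshold for calling the difference significant at the 95 percent confidence level; this gap clears it easily.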

Most marketing automation tools, email service providers, and digital advertising platforms include A/B testing.

Multivariate Tests

Multivariate testing allows you to measure multiple variables simultaneously. Examples include:

  • Call-to-action button. Testing the color and the text.
  • Email marketing. Measuring the combined headline, body text, and image.
  • Social media. Posting the same post on different days and times.
  • Promotions. Testing combinations of products and promotion types.
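
The number of variants grows multiplicatively with each element you add, which is why multivariate tests need so much more traffic. A quick sketch, using hypothetical element lists:

```python
from itertools import product

# Hypothetical elements under test
button_colors = ["orange", "green"]
promos = ["Free shipping over $50", "10% off over $50"]

# Each combination of elements becomes one variant the test must measure
variants = list(product(button_colors, promos))
print(len(variants))  # 2 colors x 2 promos = 4 variants
```

Adding a third element with two options would double the variant count again, to eight, splitting your traffic that much thinner.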

An accurate multivariate test requires much more data than a simple A/B test, so low-traffic websites and small email lists cannot run them reliably. But websites with thousands of products and tens of thousands of monthly visitors can use a multivariate method to find a winning combination faster than running A/B tests one at a time, dramatically improving performance.

Mathematically, multivariate tests are more complex than A/B tests. Using a proper tool, such as Symposeum or Optimizely, goes a long way toward ensuring accuracy. Otherwise, identifying the winning combination requires sophisticated methods such as regression models, multivariate analysis of variance, or cluster analysis. Either way, you cannot confidently identify a winner without enough data.

For example, say you are testing two elements on your website: two colors for the add-to-cart button and two promotions. You want to know which combination produces the most conversions. You therefore test the four combinations of those elements, dividing your traffic into four parts. For a site with 10,000 monthly sessions, each combination would receive 2,500 sessions. You run the test for 30 days. Consider the results.

  Combination                Sessions   Conversions   Conversion Rate
  Button Color 1 + Promo A   2,500      25            1.00%
  Button Color 1 + Promo B   2,500      27            1.08%
  Button Color 2 + Promo A   2,500      23            0.92%
  Button Color 2 + Promo B   2,500      20            0.80%
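
The conversion rates above are simply conversions divided by sessions for each variant, as this short sketch of the month-one data shows.

```python
# Month-one results from the table above: (sessions, conversions)
results = {
    "Button Color 1 + Promo A": (2500, 25),
    "Button Color 1 + Promo B": (2500, 27),
    "Button Color 2 + Promo A": (2500, 23),
    "Button Color 2 + Promo B": (2500, 20),
}

# Conversion rate = conversions / sessions
rates = {name: conv / sess for name, (sess, conv) in results.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.2%}")
```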

The combination of “Button Color 1 + Promo B” appears to be the winner, with 27 conversions. But that is only two more than “Button Color 1 + Promo A,” a gap that could easily be statistical noise given the relatively small sample of 2,500 sessions per combination. The statistically safe approach is to run the test for another 30 days to confirm the results, testing only the two “Button Color 1” combinations.
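
One way to see why the two-conversion gap is inconclusive is to put a confidence interval around each rate. This sketch uses the simple Wald approximation for a 95 percent interval, applied to the two “Button Color 1” variants from the table.

```python
from math import sqrt

def wald_ci_95(conversions, sessions):
    """Approximate 95% Wald confidence interval for a conversion rate."""
    p = conversions / sessions
    margin = 1.96 * sqrt(p * (1 - p) / sessions)
    return p - margin, p + margin

lo_a, hi_a = wald_ci_95(25, 2500)  # Button Color 1 + Promo A
lo_b, hi_b = wald_ci_95(27, 2500)  # Button Color 1 + Promo B
print(f"Promo A: {lo_a:.2%} to {hi_a:.2%}")
print(f"Promo B: {lo_b:.2%} to {hi_b:.2%}")
```

The two intervals overlap heavily, which is exactly why running a second month on these two variants is the safer call.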

Combining the first and second month of testing, you get the following results.

  Combination                Sessions   Conversions   Conversion Rate
  Button Color 1 + Promo A   7,500      90            1.20%
  Button Color 1 + Promo B   7,500      82            1.09%
  Button Color 2 + Promo A   2,500      23            0.92%
  Button Color 2 + Promo B   2,500      20            0.80%
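
With the two months pooled, picking the winner is a straightforward comparison of the combined rates, sketched here with the totals from the table.

```python
# Combined two-month totals for the surviving variants: (sessions, conversions)
combined = {
    "Button Color 1 + Promo A": (7500, 90),
    "Button Color 1 + Promo B": (7500, 82),
}

# Highest combined conversion rate wins
rates = {name: conv / sess for name, (sess, conv) in combined.items()}
winner = max(rates, key=rates.get)
print(f"{winner}: {rates[winner]:.2%}")
```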

After measuring the results with enough data, you see that “Button Color 1 + Promo A” is the better performer.

Sometimes tests identify conclusive winners; sometimes they do not. But when done right, even minor tweaks can significantly improve performance, pushing the limits of your click rates, open rates, and conversions.


