One of the first things you learn as a marketer is that your audience is everything. It drives conversations around your brand, fuels your business and informs the content you create and share.
There are certain assumptions you can make about what creative assets will resonate most with your target audience based on industry benchmarks, platform demographics and general observations. But why assume when you can take the guesswork out of it with creative testing?
Creative testing is a way to assess which images and copy combinations support your goals best. It can help your brand enhance ad campaigns before they launch, create more impactful creative assets, understand which concepts steer your audience through the marketing funnel and so much more.
With data to back them up, marketers can apply logical, results-oriented insights to their strategies. At the same time, they can relay these insights to their creative partners to show the value of their work and inform future projects.
At Sprout, the test-and-learn approach is tried and true. In the last year, Sprout’s organic social media, digital advertising and creative teams put a renewed focus on testing and built a framework to help them track, understand and improve creative results. In this article, we’ll share the most valuable lessons from our experiments that can help guide your own creative planning.
The seven lessons we’ll cover are…
- Testing requires transparency between marketing and creative teams
- Start with a broad hypothesis, then zero in on specifics
- Paid and organic can and should work together
- Testing is an opportunity to build relationships
- You can still test on a tight budget
- Determine the testing structure that works for your goals
- Everything is a learning experience
Testing requires transparency between marketing and creative teams
Sprout’s creative testing initiative focused on a full-funnel approach, from awareness to acquisition. While the organic team focuses on increasing awareness of our brand—measured by impressions, clicks, engagement rate and video views—paid focuses on acquisition through direct response and lead generation.
At Sprout, the goal for direct response campaigns is to get users to sign up for a trial or request a demo, which then ultimately leads to a subscription. Lead generation campaigns promote gated content and require users to fill out a form in order to download a free guide or watch a webinar.
Creative assets and messaging pave the path to those goals. They’re what make your audience click on an ad, watch a video, like a Tweet or download gated content. Testing, therefore, requires transparency between social and creative about your hypotheses, performance and results.
Start with a broad hypothesis, then zero in on specifics
Your hypothesis is the starting point for your testing and should focus on specific assumptions you can clearly prove or disprove through testing. The key is to start broad and get more specific as you test. For example, perhaps you start by testing illustrations vs. photography, or video vs. static images.
Marketers often assume that video is king and converts best, but it actually depends on your objective. “Video ads are great for awareness because people will watch the video and hopefully take something away from it. However, that doesn’t always translate to direct response, which we found through testing,” says Shelby Cunningham, Sprout’s Digital Marketing Lead. “Getting people to watch a video, go to a landing page and sign up for a trial just didn’t work for us. We’ve actually found that static ads drive a higher conversion rate for direct response, which is not what we would have assumed at the beginning.”
Sprout’s organic social team also had hunches about what kind of content and creative works for awareness goals. Testing creative assets and sharing the results with the creative team helped them get buy-in for new and different types of creative assets. In the last year, the organic social team’s main KPI was impressions, and in contrast to paid, video was what drove that metric. With that data in their back pockets, the team made a case to spend more time on video rather than just static images.
Once you home in on specific metrics that identify what is and isn’t successful, you can get more granular about what you’re testing. If you find that static photography or illustrations work best for your goals, you might consider testing whether people-focused or product-focused imagery converts better. Or perhaps you want to know if a headline in your image would drive more clicks than an image without one. These are all creative variables you can test.
Paid and organic can and should work together
“The organic and paid teams both have our own individual goals, but a big part of the last year was finding ways we could work together more cohesively and therefore work together better with brand creative,” says Cunningham.
The Sprout Social Index, one of Sprout’s largest campaigns each year, is built around an annual data report and requires collaboration across the organization and open lines of communication between the organic and paid teams.
“Knowing the ideas our organic team already has in mind for a great video or promotional images for the Index, I can easily say, ‘I love this idea, and here’s how I think we could tailor what you’re already doing to get a version for paid,’” says Cunningham. “Then, we can come together and have one ask for creative that’s tailored for both paid and organic but based on the same concept.”
Having these open conversations can help build a stronger relationship between organic and paid teams—which can provide huge benefits beyond the realm of creative collaboration. While these two teams have different goals and metrics they’re focused on, working together on testing is a great way to find commonalities and ways to maximize the resources you already have.
Testing is an opportunity to build relationships
Testing is a great way to bridge the gap between social and creative teams. At Sprout, all stakeholders from the paid, organic and creative teams come together for a quarterly meeting to discuss metrics and testing results. On the creative side, team members from visual design, web design, video and copy join the meeting.
With a diverse group of contributors all in one room, everyone gets a full picture of what’s working and has a chance to weigh in with their immediate thoughts, ask questions or share ideas for the future.
“It’s incredibly valuable to have creative and marketing teams aligned around data. For creatives, this kind of quantitative data gives us a bit more insight into what resonates with our audiences at scale on different platforms and channels. That helps us work on designs that can best perform with the different audience segments we are targeting,” says George Mathew, Sprout’s Senior UX Designer. “It’s a challenge, though, since this is often a fast-moving target and you have to balance changing user needs and tastes with our own evolving brand identity and strategy.”
“The closer marketing and creative collaborate to understand the end user, the better the results tend to be,” Mathew adds.
In addition to sharing data, these kinds of meetings drive conversations about the why behind creative requests. “It’s all about building trust, transparency and credibility,” says Cunningham. “We’re not just saying, ‘Our gut is telling us that we think a colorful illustration is going to work better.’ We have the data to prove it.”
Creative testing ladders up to your social goals, but it’s also a great way to support the goals of your creative team. “This kind of collaboration has definitely improved our relationship with creative. The designers and the video team can now see the impact that their work has on our overall strategy based on all of the super niche content types that we’ve created,” says Olivia Jepson, Sprout’s Social Media Specialist.
“And that doubles for paid,” says Cunningham. “In many cases, designers will create the images or ads and then hand them off, but showing how much their work affects what I’m doing as well has helped build a more productive relationship.”
Another outcome of increased collaboration between social and creative is efficiency. “There were so many times in the past where the designers would have finished a project for organic, then I would see it and request revisions for paid,” says Cunningham.
Collaborating early on saves time, and it also gives designers a chance to freely share ideas for stories or assets they’ve been thinking about and are interested in testing for their own purposes.
You can still test on a tight budget
If you have a limited budget for testing, or none at all, run tests through organic first. For example, rather than paying to test a photo against an illustration as two separate ads, test both organically, free of charge. Then put paid spend behind whichever one performs better, or use your findings to make the case for investing in paid ads.
“All social media managers are already doing a lot of testing, even if you’re not necessarily thinking of it that way or being purposeful in it,” says Jepson. “You’re always posting, so why not track everything you do and learn from it? If you do a little more work up front, like organizing your posts with tags, it’s going to be so much easier for you to pull data and find insights down the road.”
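If you want to see what that tag-based tracking can look like in practice, here’s a minimal sketch in Python with pandas. The post data, tag names and metrics are all made up for illustration; the idea is simply to group posts by creative tag and compare engagement rates:

```python
import pandas as pd

# Hypothetical export of organic posts, each tagged by creative type
posts = pd.DataFrame([
    {"post_id": 1, "tag": "illustration", "impressions": 4200, "engagements": 310},
    {"post_id": 2, "tag": "photography",  "impressions": 3900, "engagements": 198},
    {"post_id": 3, "tag": "illustration", "impressions": 5100, "engagements": 402},
    {"post_id": 4, "tag": "photography",  "impressions": 4400, "engagements": 251},
])

# Aggregate by tag and compute the engagement rate per creative type
summary = posts.groupby("tag").agg(
    impressions=("impressions", "sum"),
    engagements=("engagements", "sum"),
)
summary["engagement_rate"] = summary["engagements"] / summary["impressions"]
print(summary.sort_values("engagement_rate", ascending=False))
```

Most social platforms and management tools let you export this kind of post-level data, so the same grouping works on a real export once your tags are in place.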
Boosting is another good way to get started in paid. “If you have an organic post that’s working really well, you don’t even need to create a new ad. You can just put $50 behind it and see how reaching a wider audience impacts your goals and hypotheses,” says Cunningham. Even if budget isn’t a constraint, boosting can give you early insight into performance and inform whether you should invest more or pull back.
Determine the testing structure that works for your goals
There are a lot of factors to account for during testing. The reporting windows you’ll want to use—that is, how long you plan to run each test—depend on your budget, the size of your audience, your KPIs and more. The most important thing to focus on is reaching statistical significance, which quantifies whether a result is likely due to chance or to the variable you’re testing.
“You may say going into testing that you are not going to determine the outcome of this test until you reach statistical significance or get 10,000 clicks, whichever comes first. There are no hard and fast rules, so you can figure out the benchmarks and structure you want to set for yourself,” says Cunningham. “Just as with everything else with testing, it will take time to find what works for you. Always have a goal for every campaign, and let the data guide you to finding what works.”
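To make that stopping rule concrete, here’s a minimal sketch in Python of a two-proportion z-test, one common way to check whether two ad variants’ conversion rates differ significantly. The click and conversion counts are hypothetical, and the 5% significance threshold and 10,000-click cap simply mirror the example in the quote above:

```python
from math import erf, sqrt

ALPHA = 0.05         # significance threshold (5%)
MAX_CLICKS = 10_000  # fallback cap from the example above

def two_proportion_z_test(conv_a, clicks_a, conv_b, clicks_b):
    """Two-tailed z-test: do variants A and B convert at different rates?"""
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    # Pooled rate under the null hypothesis that the variants are equal
    pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = sqrt(pool * (1 - pool) * (1 / clicks_a + 1 / clicks_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical results: static ad (A) vs. video ad (B)
p_a, p_b, p = two_proportion_z_test(conv_a=130, clicks_a=5_000,
                                    conv_b=95, clicks_b=5_000)
stop = p < ALPHA or (5_000 + 5_000) >= MAX_CLICKS
print(f"A: {p_a:.2%}, B: {p_b:.2%}, p-value: {p:.3f}, stop test: {stop}")
```

For real campaigns you’d likely reach for a statistics library rather than hand-rolling the math, but the logic is the same: don’t call a winner until the difference is unlikely to be chance, or your click cap is hit.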
You’ll also want to plan to regularly refresh your social ads. Here are some refresh cadences Sprout follows for different platforms (with a quick tracking sketch after the list):
- LinkedIn and Facebook: Refresh ads once a month by changing copy and/or image.
- Twitter: Refresh ads every two weeks.
- Google Display Network: Refresh ads as needed, but these can be evergreen.
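As promised above, here’s a small hypothetical sketch of how you might encode those cadences to flag ads that are due for a refresh. The platform keys, function name and dates are illustrative only:

```python
from datetime import date, timedelta

# Refresh cadences from the list above (None = evergreen, refresh as needed)
REFRESH_DAYS = {
    "linkedin": 30,
    "facebook": 30,
    "twitter": 14,   # every two weeks
    "gdn": None,     # Google Display Network: evergreen
}

def needs_refresh(platform: str, last_refreshed: date, today: date) -> bool:
    """True if the ad's copy/image is older than its platform's cadence."""
    cadence = REFRESH_DAYS[platform]
    if cadence is None:
        return False  # evergreen: refresh only as needed
    return today - last_refreshed > timedelta(days=cadence)

# Hypothetical ad last refreshed Jan 1, checked on Jan 20
print(needs_refresh("twitter", date(2024, 1, 1), date(2024, 1, 20)))  # True
print(needs_refresh("gdn", date(2023, 6, 1), date(2024, 1, 20)))      # False
```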
Everything is a learning experience
You set your hypothesis and a goal from the beginning, but it’s important to be open to tests not going the way you think they will.
“When it comes to testing, always be flexible, patient and willing to try new things. Sometimes you might have to run tests multiple times before you’re able to achieve that ultimate goal,” says Cunningham.
For example, if you set out to lower your cost per acquisition and you changed your content type but didn’t achieve that result, figure out a new kind of test. Maybe instead of changing the content type, change the placement you’re running.
Above all else, marketers should accept that there is no one-size-fits-all approach to creative testing. Results can be as unique as you and your brand are. If you don’t reach your goal on your first go-around, just try again. Take each challenge as a learning experience, not a failure. So go forth and get testing!