Is there a time that you shouldn’t test a marketing campaign’s ad creative? This was a question raised by Amrita Mathur, VP of Marketing at Superside, at one of our Ad Designpalooza sessions. The panelists, Shanee Ben-Zur, CMO at Crunchbase, and Hakim Garuba, Performance Marketing Manager at Square, agreed that while testing is important, there are some cases when it simply doesn’t make sense.
Here are some quick testing rules to follow:
For performance marketers, A/B testing (or split testing) is not a new concept. It's a tactic we often use to make data-driven decisions, especially in digital advertising. We research, analyze, hypothesize, test and launch, and then the cycle repeats. Executed properly, the results can have a positive impact on your bottom line.
There have probably been many times when you've redesigned a banner ad, revised a landing page or made minor tweaks based on trends or personal preferences. The truth is, this approach doesn't always work, especially if you're looking to scale, test and iterate on complex designs. Being a performance marketer often means finding the optimal mix of offer, copy and design. So if you're not split testing your designs or A/B testing your landing pages, are you really optimizing for performance? To get you ready for split testing, this post will cover:
A/B testing is an important evaluation tool in the design process. The design process isn't just an art; it's also a science that relies on cold, hard data from the research and ideation stages. Determining which version of your landing page has the most impact, or which variation of your social ad wins, is key to understanding which design elements improve conversions or increase the number of leads.
3 reasons why every performance marketer needs to be testing variations of their designs:
TIP: Your designers shouldn't be just task takers. The real magic happens when designers are given the chance to be strategic, creative thinkers, but they can't do that without knowing how their designs are performing and what the campaign goals are.
A/B testing forces you to evaluate every aspect of your creative. While creating multiple variants, you're also building a list of potential improvements whose impact you can measure for statistical significance. Consequently, using an A/B testing tool makes the final version of the page or creative better for your target audience.
A/B testing may be a longer process, but, when executed properly, it will help you improve clickthrough rates, convert more leads and, ultimately, increase sales. Better customer engagement means more customer conversions, which leads to increased sales volume for your business. By testing designs and different call-to-action buttons, it's possible to improve conversion rates.
With A/B testing, it's simple to analyze real, factual results: no more guessing. The data helps you determine a "winner" and a "loser" based on straightforward metrics (e.g. time spent on page, conversions). There's no ambiguity and no room for opinion to override the data. As you know, the most beautiful designs aren't always the most effective, so A/B testing lets the data speak for itself!
A/B testing can help you examine how website visitors and your audience behave with your creative before committing to major decisions or significant company budget.
We should also mention that there are a few downsides to over-testing your creative. Although testing designs often leads to better outcomes, there are times when it doesn't make sense to test at all. If you're keen to learn more, we've got a whole blog on how to balance data and creativity! Use the data wisely: the data should work for you, not the other way around.
A/B testing a handful of variants is manageable, but have you ever tried to do it for dozens, if not hundreds, of variants? Let me put it into perspective for you:
As the Performance Marketing Manager at Superside, I recently launched a Facebook ad campaign to test which ad creative resonates with our target audience. We ended up with 36 design variants and two different text elements, for a total of 72 ad variants. Thankfully, Superside has a scalable design team to help us execute this. That may not be the case for you.
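To put the combinatorics into perspective, here's a minimal sketch in Python of how 36 designs and two text elements multiply into 72 ad permutations. The variant names are made up for illustration, not our actual campaign assets:

```python
from itertools import product

# Hypothetical variant lists: 36 creative designs and 2 text elements
design_variants = [f"design_{i:02d}" for i in range(1, 37)]
text_variants = ["headline_a", "headline_b"]

# Every combination of design and text becomes one ad variant to brief, build and traffic
ad_variants = [
    {"design": design, "text": text}
    for design, text in product(design_variants, text_variants)
]

print(len(ad_variants))  # 72
```

Each of those 72 combinations needs to be designed, trafficked and monitored, which is why scale becomes the real constraint.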
It’s important to involve your design team throughout the design process, which includes reviewing results from your tests. This helps your designers understand what you’re testing and why, so that they can constructively contribute with ideas on what to test next.
Advantages:
Disadvantages:
If you don't have the internal resources to scale design, this is an opportunity to work with agencies, freelancers or a solution like Superside that can build creative iterations and tests for your campaign.
Advantages:
Disadvantages:
Review and acquire tools that will make A/B testing design at scale easier for you and your design team.
Now that we've covered the basics of A/B testing design, the last question is: how do you conduct efficient A/B tests for design? We spoke with Superside's Amrita Mathur, VP of Marketing, and Anneke King, Executive Creative Director, about their best practices for planning and carrying out A/B tests, and here's what they shared with us:
To start your A/B testing on the right foot, determine what variable(s) you want to test: copy, graphic, layout, etc. You can test one variable at a time or test more than one (known as a multivariate test), depending on the complexity of your A/B testing. It's also important to define key performance indicators (KPIs) and establish baseline results to compare against (e.g. your current conversion rate).
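As a rough illustration, your test plan can live in something as simple as a small config that captures the variables, KPI and baseline before launch. The field names and values below are hypothetical, not a prescribed template:

```python
# Illustrative test plan: write this down before any variant ships
test_plan = {
    "hypothesis": "A livelier background will lift landing page conversions",
    "variables": ["background_color"],      # one variable = A/B test; several = multivariate
    "kpi": "conversion_rate",
    "baseline_conversion_rate": 0.042,      # current rate to compare against (made-up number)
    "minimum_detectable_effect": 0.10,      # smallest relative lift worth acting on
    "variants": {"control": "current_bg", "treatment": "colored_bg"},
}
```

Writing the plan down this way also makes it easy to hand the brief to your designer or design team.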
If you haven’t already, loop in your designer or design team at this stage. As previously mentioned, Superside recently launched a Facebook ad campaign with 72 ad variants. When we knew we’d be A/B testing, we looped in Piotr Smietra, Marketing Creative Director, to get his input on what variables we should be testing and an idea of what the timeline would be to deliver those 72 ad variants.
A solid, user-centered hypothesis is a must for your A/B test. A strong hypothesis keeps your evaluation focused on the relevant metrics rather than filling gaps with assumptions, and it helps you determine whether an A/B test of those variables is even worthwhile.
A few months ago, Superside conducted an A/B test on our top of funnel landing pages. This was our hypothesis: “If we change our landing page background to a more lively/colored background, then we’ll see an increase in conversions, because the webpage is more appealing to our audience.” And the results? Our hypothesis was confirmed, so from now on you’ll be seeing more lively landing page backgrounds from us!
In an ideal scenario, your A/B test has a “eureka!” moment with a clear winner and loser. Sometimes, though, A/B tests fail to produce statistically significant results. If that happens, run your experiment longer. And if your hypothesis is disproven, don’t be discouraged: even a failed test surfaces insights and opportunities for your next iteration, increasing your chances of succeeding.
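If you want a quick way to sanity-check whether a difference between two variants is statistically significant, here's a minimal sketch of a two-proportion z-test in Python. The conversion numbers are made up purely for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (illustrative sketch)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 120/4000 conversions on control vs. 151/4000 on the variant
z, p = two_proportion_z_test(120, 4000, 151, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")  # if p >= 0.05, keep the test running or iterate
```

A p-value above your threshold (commonly 0.05) simply means you don't have enough evidence yet to call a winner, which is exactly the "run it longer" case described above.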
When you focus too much on “data-driven” design, user experience can suffer. A/B testing tends to focus on quantitative traffic data, so it’s also important to set KPIs that capture meaningful user actions (e.g. registrations, form submissions) when running an A/B test. A high-traffic landing page may sound great, but if the page isn’t generating form submissions, then something needs to change to move visitors to act.
Rushing an A/B test won’t get you reliable results any faster. That’s why it’s important to allow sufficient time for a thorough A/B test, with enough participants or iterations to reach strong, concrete results. As the saying goes, “slow and steady wins the race!”
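How many participants is "enough"? A common rule of thumb is to estimate the per-variant sample size up front from your baseline conversion rate and the smallest lift you'd care about. The sketch below uses the standard two-proportion sample size formula at roughly 95% confidence and 80% power; the baseline and lift figures are assumptions for illustration:

```python
from math import ceil, sqrt

def sample_size_per_variant(baseline_rate, relative_lift, z_alpha=1.96, z_power=0.84):
    """Rough per-variant sample size for a two-sided test (~95% confidence, ~80% power)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical: 4% baseline conversion rate, hoping to detect a 15% relative lift
print(sample_size_per_variant(0.04, 0.15))  # visitors needed per variant before calling a winner
```

Smaller lifts and lower baseline rates push the required sample size up sharply, which is why short, low-traffic tests so often end without a clear winner.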
Things change over time, and that applies to A/B testing. Make A/B testing a regular part of your process to ensure you have the most up-to-date data on your design variables. A high-performing CTA button won’t always stay high-performing, and that’s when you’ll need to A/B test again to see if there’s a better way.
Every marketing team has its own approach to A/B testing. At Superside, we believe in iterating and optimizing to understand our customer base and the market at large. From ABM to content, our marketing team A/B tests design regularly. It also helps that we have a design team and creative support to execute (yes, we use our own solution as well!).
Take this as your sign to start A/B testing design, whether that’s landing page designs or digital ad designs. The results of your creative tests provide valuable data that gets fed back into the design process to iterate over and over again until you have the winning variant.
For performance marketers who want to A/B test design at scale, these are the three main points to remember:
If you have started A/B and multivariate testing design, keep going because you’re on the right track!