In today’s fast-paced world of digital marketing, A/B testing has become a mainstay for brands looking to optimize their email campaigns. While many are using this technique, not all are harnessing its full potential. In this blog, we’ll dive into how A/B testing, once a niche approach, has now become a cornerstone of email marketing. We’ll uncover the common pitfalls that limit its effectiveness and offer insights into maximizing the overall benefits of A/B testing.
It’s clear that A/B testing is the best way to continually refine email campaign strategies over time and improve performance through data-driven decision making, but how can we perfect this strategy as marketers? Let’s jump into the basics…
A/B testing is also called split testing or variable testing. When you conduct an A/B test, you compare two variables to see which one performs better. Email A/B testing uses this process to test variations of emails or email campaigns against each other to see which performs better for a specific metric, such as open, click-through or conversion rates.
Companies typically employ this technique by segmenting their email list into two groups, version A and version B, to test different variations of an email. The split can be even, or it can be done as a 10/10 split, with the remaining 80% of the list later receiving the winning version. More advanced approaches use holdout groups, where emails are tested on a subset (e.g., 10% or 20%) of the regular email list for that specific segment.
After a designated time frame, typically 1 or 2 hours, sufficient data is collected to determine which version performs better. The winning version is then sent to the remaining members of that segment, helping email marketers refine their campaigns for maximum effectiveness.
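As a rough illustration of the 10/10/80 split described above, here is a minimal Python sketch. The subscriber list, fractions, and function name are hypothetical, not part of any particular email platform's API:

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a subscriber list into A, B, and holdout groups.

    With test_fraction=0.2, 10% of the list gets version A, 10% gets
    version B, and the remaining 80% is held back to receive the winner.
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # randomize to avoid ordering bias
    test_size = int(len(pool) * test_fraction)
    half = test_size // 2
    group_a = pool[:half]
    group_b = pool[half:test_size]
    holdout = pool[test_size:]
    return group_a, group_b, holdout

subscribers = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_for_ab_test(subscribers)
print(len(a), len(b), len(rest))  # 100 100 800
```

Shuffling before slicing matters: if the list is sorted by signup date, an unshuffled split would compare newer subscribers against older ones rather than the email variations themselves.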
You can test simple or complex variations. Here are a few examples of A/B email testing:
Sending variations of subject lines to see if subscribers respond better to certain words, phrases or formats
Testing variations of CTAs — such as the verbiage or where the CTA is located
Launching email campaigns on different days or at different times to find the best time for open rates
A/B testing for email campaigns doesn’t have to be difficult, especially when you use the right software and tools. These tests are highly effective and widely adopted: more than 50% of marketers use A/B testing to boost conversions. But conversion isn’t the only benefit of A/B testing…
A/B testing has other benefits, including:
Statistical proof for your email marketing decisions — you won’t have to rely on instinct or trust existing processes, which may not lead to the best possible outcomes
A competitive edge over others in the industry, especially if they aren’t also using A/B email testing to improve performance
A better understanding of your target audience and what messaging resonates with customers — data you can use to inform future email campaigns or other marketing strategies, including social media and web content
An increased ability to improve critical email marketing metrics like click-through and conversion rates as well as revenue driven by email
Of course, there are several things to consider when A/B testing. You can only test one variable or element at a time. Otherwise, you don’t know which element is responsible for any improved performance. It’s also important to keep in mind that privacy features like Apple’s Mail Privacy Protection make open rates difficult to track reliably, and similar protections will likely spread to other providers. This means you should focus on KPIs like clicks and conversions.
If you want to put email A/B testing to work for your organization, we’ve got eight variables you may want to test. Remember, choose one at a time when you set up your tests — otherwise, you muddy your data and won’t get any actionable insight.
The subject line is one of the most important elements of any email because it’s a major factor in whether someone opens the email or not. This element typically shows up in bold right under the sender name or in another prominent location in the inbox.
Subject lines are a common variable of A/B testing because they’re so powerful and because they’re easy to test. You simply send the same email with different subject lines.
Here are some ideas to try when A/B testing email subject lines:
Change the length of the subject line. Mobile devices typically display around 30 characters and desktop clients around 55, so start by testing within those ranges. Find out whether your target audience prefers a shorter or longer subject line.
Rephrase your subject line. Test out different words and approaches, such as the difference between “Exclusive offer” and “Limited-time offer.”
Test personalization. Your audience is more likely to respond when their name is in the subject line.
You can also test the inclusion of emojis, symbols or punctuation as well as asking questions.
The preview text, also commonly referred to as preheader text, is a snippet, summary or sneak peek into the email’s contents, and it shows up under the subject line on some devices. It’s not as powerful as the subject line itself, but you will only know if it matters to your audience if you test it.
During this phase, consider testing:
Original preview text compared to just including the first line of the email. Sometimes the first line of your email copy is enough to pique someone’s interest, but custom preview text could be more effective at instilling a sense of urgency or summarizing the contents of the email.
Various calls to action in the preheader. For example, test whether your audience responds better to a directive to open the email and find out more or a more subtle call to action.
Different summaries of what’s in the email. Preview text is short, so it can only include a little bit of information. If you can’t decide what the most important bit of your email is to tease, test it.
Sender name is what shows up in the “From:” field in an email. Emails sent by your brand might show up as “From: ABC Brand,” for example. Or you might create emails that come from specific people: “From: Sue at ABC Brand.”
What sender name will help build a personal connection with your audience best? You can’t know that until you conduct some A/B email testing.
Consider testing options such as:
Including a person’s name instead of the company name to add a human element to email marketing.
Testing full names versus first names only to determine how your audience wants to connect with your employees.
Sending from a different email address that’s more closely connected to a product or sounds more professional, to see whether it resonates better with your audience.
Google the best time to send emails and you’re likely to run across multiple articles stating that Tuesday afternoons are the ticket. In reality, Tuesday afternoons work best for some businesses. That doesn’t mean it will work best for you.
The only way you can know what day and time is best for your audience is to split test by sending emails at different times and narrowing it down for yourself.
You also have to account for trigger emails, which can’t all be sent on Tuesday afternoons. For example, you may find that cart abandonment emails work best when sent 2 hours after the person puts an item in the cart and welcome emails work best 10 minutes after sign up. Note that these aren’t recommendations; they’re examples. Run the tests for yourself to find out what works for your audience.
When you’re running A/B testing on email send times, remember to segment by time zone if possible. That way, you can figure out what’s best for each subsection of your audience.
Ideas for A/B testing email send times include:
Testing the day of the week you send emails. Consider setting up a tournament of sorts. Have days of the week compete against each other and use email metrics to determine the champion — and find out if it is, indeed, Tuesday.
Testing times of day. Does morning or evening work better for your audience? Do you get better performance at lunchtime or during the afternoon slump around 3:00? These are the questions you can answer when you A/B test email send times.
Testing how long after a trigger you should send emails. Do you get better results when cart abandonment emails are sent 1 hour later, or do people return and make purchases more often when emails show up a day later?
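The day-of-week “tournament” above comes down to tallying a metric per send day and picking the best performer. Here is a minimal sketch, using a made-up send log rather than data from any real email platform:

```python
from collections import defaultdict

# Hypothetical send log from a day-of-week test: (weekday, opened) per recipient.
send_log = [
    ("Tue", True), ("Tue", False), ("Tue", True), ("Tue", True),
    ("Thu", True), ("Thu", False), ("Thu", False), ("Thu", True),
]

stats = defaultdict(lambda: [0, 0])  # weekday -> [opens, sends]
for day, opened in send_log:
    stats[day][0] += int(opened)
    stats[day][1] += 1

open_rates = {day: opens / sends for day, (opens, sends) in stats.items()}
winner = max(open_rates, key=open_rates.get)
print(open_rates)  # {'Tue': 0.75, 'Thu': 0.5}
print(winner)      # Tue
```

The same tally works for any single variable (time of day, trigger delay), as long as only that one variable differs between the groups being compared.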
The CTA tells the email reader what to do next, so it’s pretty important. A/B testing helps you improve CTAs to improve click-through rates.
Consider testing:
The actual words used in the CTA
How often you include CTAs in emails — does one above the fold work, or should you repeat it later in the email?
Whether a button works better than text for your audience, and if so, whether changing the button’s color improves performance further
Whether size, font choices or capitalization make a difference
The location of the CTA in the email
Most marketers agree that short and sweet is best when it comes to email copy. In fact, it’s often a good idea to concentrate on a single idea in each email marketing message. Your copy also needs to be engaging and grab the attention of the recipient.
Of course, “attention-grabbing” is a subjective description, and what captures the eye of one audience won’t engage another. A/B testing helps you determine what copy works best for your audience.
Test factors such as the length of your copy, the words and style of writing you use, whether you include personalization and the tone. For example, does your audience respond better to formal or informal writing?
It only takes a couple of seconds before someone decides whether to continue reading your email or not. Email readers definitely judge the book by the cover, so to speak, so your design and layout matter.
Test out design and layout variations, such as plain text versus HTML, or simple designs versus messages with many bells and whistles.
An email marketing design that resonates with your audience improves click-through and conversion rates. It can also increase brand awareness and create positive downstream effects on marketing efforts outside of email.
If you have a little experience with social media marketing, you know that images are powerful. Facebook and Instagram posts with images get much more engagement on average than text-only posts. The same can be true for email.
A/B testing can help you understand where the line on images is for your audience.
Some ideas for A/B testing images in email include:
Whether or not to include a header image
How many images you include
What type of images you include — for example, a person versus a product
What style of image you include — for example, black and white versus colored or realistic versus painted
Now that you have plenty of ideas for A/B email testing, let’s look at a few tips for running the best split tests you can.
Start by thinking about the business goal you want to meet. For example, if you want to increase conversion rates, then you might want to work on optimizing your CTAs, as they directly relate to conversions. Subject lines impact open rates, while images and email copy can improve engagement and positive brand affinity.
Once you start working on a specific variable, repeat the test across different emails. This lets you collect more data and normalize it. Otherwise, other factors could inadvertently impact your test.
You should also test with the right type of email. If you’re trying to figure out when the best time for cart abandonment emails is, testing with your monthly subscriber newsletter is pretty useless.
Select users randomly for tests and avoid using the same people for every test. Most businesses can test with around 20% of their list. However, if you only have a few hundred subscribers, 20% of that number won’t lead to statistically significant conclusions. In these cases, test with about 80% of your list.
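Whether a difference between version A and version B is statistically significant can be checked with a standard two-proportion z-test. The sketch below uses only Python’s standard library; the counts are hypothetical:

```python
from math import sqrt, erf

def two_proportion_z_test(hits_a, sent_a, hits_b, sent_b):
    """Two-sided z-test for a difference in click (or conversion) rates."""
    p_a = hits_a / sent_a
    p_b = hits_b / sent_b
    pooled = (hits_a + hits_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 120 clicks from 1,000 sends vs. 90 clicks from 1,000 sends
z, p = two_proportion_z_test(120, 1000, 90, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these numbers, p falls below the conventional 0.05 threshold, so the difference would be considered significant. With only a few dozen sends per group, the same 3-point gap would not be, which is why small lists need a larger test fraction.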
It may seem simple, but the final step is actually running the test. Surprisingly, between 30 and 50% of organizations never get to this step.
Be patient as you wait for results. If you’re looking at a metric like open rates, you may have a pretty good idea of performance in just a few hours — and you’ll usually know which variation is the winner within a day.
But other metrics, such as click-through and conversion rate, take longer to measure. That’s because someone may open your email and decide to come back to it later or think about your offer. For these types of metrics, you may need to let the test run for a few days to ensure you have a good sampling.
By leveraging A/B testing, you can support email marketing campaigns that perform better. Looking to learn more about this process? Reach out to our Tinuiti Lifecycle Marketing experts today.