A/B Testing In Email Marketing—What You Need To Know

Two letter boxes on a brown door labelled A and B

Going through a rough patch with your email campaigns? Tired of dealing with plummeting open rates and conversions? A/B testing in email marketing is the best way to identify what your target audience responds well to, refine and enhance your email campaigns, and improve overall revenue. In this guide, we’ll walk you through what A/B testing is, what you need to consider and how you can use it to build an effective email marketing campaign.


Crafting an email campaign

A lot goes into the creation of an effective email. You’ve got to come up with a persuasive subject line that actually encourages your target audience to open your email in the first place. Next, you’ve got to figure out what imagery you want to use, how much text is necessary and what format you want to follow. In short, you’ve got a lot of variables to juggle.

And the worst part? There’s no way of knowing what will work right off the bat. You might spend weeks working on an email, only to end up with dismal open rates and next-to-no conversions—sending you back to the drawing board, trying to figure out exactly what it was that didn’t work.

Enter: A/B testing…

A/B testing in email marketing

Gmail loading in a browser on a laptop. Image source: Solen Feyissa (via Unsplash)

In the context of email marketing, A/B testing, otherwise known as split testing, is the method of creating two versions of an email. One version will be sent to a sample subscriber group, and the other version to a different sample group.

The aim of this exercise is to figure out which email works best for your target audience. So, once you’ve sent them out, you’ll monitor the performance of both—focusing on metrics like opens and clicks—to determine which one comes out on top. The winning email will then be sent out to your full list of email subscribers.

How can I carry out A/B testing?

You can carry out A/B testing with most email campaign tools nowadays, from Campaign Monitor to MailChimp. You’ll simply create your two email variations, compile your sample groups, pick which metric decides the winner (whether that’s opens, clicks or conversions) and the campaign tool will reveal the winner after a specified window of time. Once a winner has been found, the campaign tool will send out the winning email to the rest of your subscriber list.

Using an email campaign tool that doesn’t offer A/B testing? Not to worry! You can sort it out manually. All you have to do is create your email variations, sort out your sample groups and then send out your emails. Once you’re ready, you can then compare the results of your emails manually.
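If you're comparing results by hand, it helps to check that the gap between your two variants is big enough to trust, rather than just noise. Here's a minimal sketch of how you might do that with a standard two-proportion z-test, using only Python's standard library (the numbers in the example are made up for illustration):

```python
from math import sqrt, erf

def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates
    between variant A and variant B statistically meaningful?"""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the assumption of no real difference
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Example: 500 recipients per variant; A gets 130 opens, B gets 95
p_a, p_b, z, p = open_rate_z_test(130, 500, 95, 500)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p = {p:.3f}")
```

As a rough rule of thumb, a p-value below 0.05 suggests the difference probably isn't down to chance. If it's higher, you may simply need bigger sample groups before declaring a winner.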

Why is A/B testing so crucial?

Even after spending weeks poring over buyer personas and audience research, there’s no way of knowing what exactly will work. And remember, you’ve got a lot of variables to consider when crafting your email campaigns:

  • Subject lines
  • Personalisation
  • Headings
  • Images
  • Additional media
  • CTAs
  • Word count
  • Tone
  • Style
  • Formatting

With A/B testing, however, you can evaluate these variables to assess what will resonate with your target audience and compel them to take the next step in the marketing funnel. In time, you’ll be able to develop a winning email campaign that ticks every box your audience is looking for—driving up open rates, user engagement and conversions.

What needs to be considered with A/B testing

Group of blank yellow sticky notes on a desk next to a keyboard. Image source: Kelly Sikkema (via Unsplash)

To ensure you get the most accurate results from your A/B testing in email marketing, there are a few factors that need to be carefully weighed up…

1. Sample size

If you’ve got a large list of email subscribers, an effective way of deciding the size of your sample groups is by following a ratio of 80/20. This means sending your email variants to 20% of your subscribers (10% for each email variant) and then the winning email to the remaining 80% of subscribers.

Of course, this is entirely dependent on how many subscribers you have in total. If you've got fewer than 1,000 subscribers, you'll likely need to flip the ratio (sending the email variants to 80% of your list) so that your findings have some statistical significance.

After all, the larger your sample subscriber group is, the more accurate your results will be.
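To make the arithmetic concrete, here's a small sketch of how the split works out in practice. The function and its numbers are illustrative, assuming the 80/20 rule described above (flipped for smaller lists):

```python
def ab_split_sizes(total_subscribers, test_fraction=0.2):
    """Work out how many subscribers each variant and the final
    send should receive. test_fraction=0.2 is the 80/20 rule;
    flip it to 0.8 for a small list."""
    test_size = int(total_subscribers * test_fraction)
    per_variant = test_size // 2      # half the test group per variant
    winner_size = total_subscribers - per_variant * 2
    return per_variant, winner_size

# 10,000 subscribers with the 80/20 rule:
a, rest = ab_split_sizes(10_000)
# A small list of 800, with the ratio flipped:
small_a, small_rest = ab_split_sizes(800, test_fraction=0.8)
print(a, rest, small_a, small_rest)
```

So with 10,000 subscribers, each variant goes to 1,000 people and the winner goes to the remaining 8,000; with only 800 subscribers and the ratio flipped, each variant goes to 320 people.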

2. Variables

Though the temptation is probably there to test multiple variables at once, you should try to limit yourself to one or two, so that you can figure out what actually works. If your two emails are completely different in every way, you won’t be able to accurately determine which variables have garnered the best results.

So, what are you testing?

Subject line

Arguably one of the most important variables in an email, the subject line is what will persuade your audience to open your email in the first place. If this is what you're focusing on in your A/B test, try experimenting with:

  • Length: vary the length of your subject line. Backlinko’s study found that subject lines between 36 and 50 characters get the best response rate.
  • Order: if you’re including a discount or a promotion, it might be more effective at the beginning than the end, e.g., “Get 20% off sitewide” vs. “Use this code to get 20% off”.
  • Personalisation: consider adding your customers’ names in the subject line to see whether it affects open rates—whether that’s their first name or a title and last name, e.g., “John” vs. “Mr. Smith”.
  • Emojis: experiment with emojis in your subject line to see whether or not your target audience responds well to them.

CTAs

The call to action(s) in your email is what will push your audience to take action, whether that’s heading directly to your website to take advantage of a discount, giving your sales team a call or reading an article on your blog.

You can experiment with the wording of your CTAs—do you want to be slightly passive, or as direct as possible? Even the visual elements should be carefully considered, e.g., shape, size, button colour and text colour.

Obviously, if you’re testing your CTAs, you’ll want to change one thing at a time. If you want to see whether a green or a red CTA button would be more effective, focus on that for the time being. If you start altering the text, shape and size as well, you won’t know exactly what affects the performance.

Images

The right imagery can catch your audience’s eyes and drive conversions, whilst the wrong imagery can distract your audience from the actual content of your email. That’s why you need to carefully weigh up your choices.

Do you want to stick to images of people, or images of your product(s)? Are you planning to use one image as the focal point of your email, or do you want to scatter a few of them throughout the email?

Try to think about your colour palettes as well. Are you sticking to your brand colours, going for something bright, or opting for a black and white approach? You might even want to consider using more interactive media types like videos or GIFs.

Length

Another factor to consider is the length of your email. You need to figure out whether your target audience prefers short and snappy emails, or long, in-depth pieces.

The best way to do this is by creating a detailed email, duplicating it and then condensing it. You can then present both versions to your audience to see which one ends up with the highest conversion and open rates.

Though it will be time consuming to have to carry out multiple A/B tests to analyse each variable, it’s worth it in the long run.

3. Timing

It is absolutely vital that you send both email variants out at the same time. If you don’t, you’ll have no way of knowing whether the higher conversion/open rate for email A was due to the variable you were testing, or because it was sent out at a different time to email B.

Once you’ve sent your emails out, you need to decide how long you intend to wait until a winner is chosen. Ideally, you’ll want to give it at least 24 hours—the longer you wait, the more accurate the results will be.

4. Delivery

If you’re using an email campaign tool for your A/B testing, you need to be aware that in most cases, your winning email will be automatically sent out to the rest of your subscribers as soon as your testing window is over. This means that you need to carefully consider when you’re carrying out your tests.

Let’s say you want your winning email to be sent out at 9am. If your testing window is 24 hours, then you’ll have to start your test at 9am the day before. If, however, your testing period is, say, 3 hours, you’ll need to start your test at 6am instead.

Obviously, this won’t be an issue if you’re carrying out your A/B tests manually. You’ll simply be able to send out the winning email whenever you want.

Why you should start A/B testing in email marketing

“Just start” text on a MacBook Pro on a desk. Image source: Dayne Topkin (via Unsplash)

As we’ve highlighted, there are countless benefits to carrying out A/B testing in email marketing. Though it can be a laborious process—in that it will take you some time to work your way through all of the variables you want to test—it is one of the best ways to improve your email campaigns.

Every time you carry out an A/B test, you learn something new. Each email variant that wins or loses will tell you something vital about your target audience. These valuable insights will slowly but surely help you to craft email campaigns that will generate higher open rates, conversions and user engagement.

So, what are you waiting for?


And voilà! You now know what A/B testing is, why it’s so crucial to crafting your email campaigns and what you need to consider when you carry it out. For more tips and advice on email marketing, keep your eyes glued to the Supersede Media Blog!
