31 Ways To A/B Test Your Emails and Boost Your Performance
What’s the difference between a bad marketer, a good marketer, and a great marketer?
A bad marketer says: We’ve always done things this way. Let’s just stick with it… there’s no need to rock the boat.
A good marketer says: We’ve always done things this way, but let’s switch things up.
A great marketer says: We’ve always done things this way, and while I can’t be certain, I think our customers would prefer a new approach. Let’s A/B test both methods, then see which one performs better.
Moral of the story? As a marketer, you shouldn’t let your personal opinions, judgements, and biases get in the way. Instead, base your decisions on data – that way, you’re far more likely to get it right.
So, if you’re trying to improve your email campaigns, don’t assume that your customers will prefer receiving newsletters with more images, or emails that are personalized with their first name.
Go ahead and test it out… and let the results speak for themselves!
In this blog post, we discuss how you can A/B test properly, and share 31 A/B email testing ideas for more inspiration.
Alright, let’s jump right in!
How To A/B Test Emails Accurately
Now, you might think that A/B testing is a relatively straightforward process… but there are actually plenty of areas where you can go wrong.
Want to make sure your test results are accurate? Then follow the tips that we’ve outlined here:
1. Only change one variable at a time
We know, we know, this is so obvious that it almost insults your intelligence, but we thought we’d throw it in here for good measure anyway.
Basically, the whole premise of A/B testing is that you’re only changing one variable at a time.
If you change, say, two variables in a single test, then you won’t know whether to attribute your results to variable A or variable B… which defeats the purpose of the test in the first place.
2. Only test when you have enough contacts
How many contacts count as “enough”?
Well, there’s no hard and fast rule here – it really depends on your engagement rates.
At the end of the day, you’re aiming for a decent number of click-throughs, so that your results stand a chance of being statistically significant. Personally, we’d shoot for at least 50 clicks per email variant.
Now, assuming your open rate is 10% and your click-through rate (measured against opens) is 2%, let’s walk backward and do the math:
50 clicks ÷ 2% click-through rate → 2,500 opens needed.
2,500 opens ÷ 10% open rate → 25,000 contacts per variant.
25,000 contacts × 2 variants → 50,000 contacts in total.
If your engagement rates are higher, you can get the same results with fewer test subjects. For instance, assume that your open rate is 20% and your click-through rate is 5%.
Going through the same calculations, in this scenario, you’ll need just 10,000 contacts in your list to get the same 50 clicks per variant.
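The walk-backward math above can be condensed into a quick sketch. This is a hypothetical helper, not part of any email tool, and the 50-click default is just the rule of thumb from earlier:

```python
def contacts_needed(open_rate, click_to_open_rate, target_clicks=50, variants=2):
    """Walk the funnel backward: target clicks -> opens needed -> contacts needed."""
    opens_needed = target_clicks / click_to_open_rate   # e.g. 50 / 0.02 = 2,500
    per_variant = opens_needed / open_rate              # e.g. 2,500 / 0.10 = 25,000
    return int(per_variant * variants)                  # both variants combined

print(contacts_needed(open_rate=0.10, click_to_open_rate=0.02))  # 50000
print(contacts_needed(open_rate=0.20, click_to_open_rate=0.05))  # 10000
```

Plug in your own rates to see whether your list is big enough to start testing yet.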
So: if your list isn’t big enough, don’t test yet, and focus on building up your base of contacts first. Conducting a test too early might lead you to arrive at the wrong conclusion!
3. Don’t evaluate the results of your test too early
Some folks check their email several times a day; others might only log in once a week.
Keeping this in mind, you don’t want to evaluate the results of your test too early. Doing so will skew your results and, again, lead you to the wrong conclusion.
So, how long should you wait?
We recommend a minimum of five days.
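And when you do finally evaluate, it helps to confirm that the winner’s lead is bigger than random noise. A standard two-proportion z-test does the job; note that the click and send counts below are made-up example figures, not real campaign data:

```python
import math

def z_score(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test: how many standard errors apart are the click rates?"""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)   # pooled click rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

z = z_score(65, 25000, 42, 25000)   # hypothetical results for variants A and B
print(abs(z) > 1.96)                # True means significant at the 5% level
```

If the z-score stays under 1.96, the gap between your variants could easily be chance – keep the test running or grow your list.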
4. Don’t compare split tests across time periods
Say you run an eCommerce store, and you want to improve your product recommendation newsletters. To do this, you run two split tests over a period of two weeks:

Week 1
Variant A: CTA displayed as a pink button (winning variant)
Variant B: CTA displayed as text

Week 2
Variant A: CTA displayed as a pink button (winning variant)
Variant B: CTA displayed as a green button
Now, if you extrapolate from the above results, does this mean that a green CTA button will definitely perform better than a CTA displayed as text?
Nope, not quite. Since you’re running the tests over different periods of time, you can’t necessarily compare the results across the tests.
For all we know, the email campaign sent out during Week 2 might have coincided with the Black Friday weekend, when shoppers were primed to buy.
This is a bit of an extreme example, but you get the idea!
31 A/B Testing Ideas For Emails
Alright, that’s all we’ve got for you on the topic of running A/B tests properly.
For more inspiration on testing your email campaigns, check out this list of 31 email A/B testing ideas that you can use:
Copy
1. Headline copy
2. Headline length
3. Body copy: soft sell versus hard sell
4. Body copy: emotive vs logical
5. Body copy: positive vs negative framing
6. Body copy: conversational vs formal
7. Body copy length
8. Discount variations: 10% off, $10 off, $20 (usual price $30)
9. Bullet points
Email subject line
10. Capitalization: lower case, sentence case, title case
11. Personalizing with names
12. Asking a question
13. Clear vs ambiguous
14. Sender name: [Name] vs [Company] vs [Name from Company]
Images and design
15. Number of product images featured
16. Product image sizes
17. Product image variations
18. Color scheme
19. Price ribbons
Calls to action
22. Single CTA vs multiple CTAs
23. Button CTAs vs text CTAs
24. Button CTAs: size of button
25. Button CTAs: color of button
26. Location of CTAs
Other elements
27. Send time and date
28. Countdown timer
29. Listicle format
30. Trust badges and icons
31. Product testimonials and ratings
A/B Test Your Way To Success
The way we see it, email campaigns are always a work-in-progress.
Yes, you might already be running a highly profitable campaign… but that doesn’t mean that you can’t fine-tune the campaign, and make it even better. That’s where A/B testing comes in.
By continually testing the different elements of your campaign, you’ll keep improving your results, bringing you more conversions, sales, and money in the bank.
Alright – less talking, more A/B testing.
If you’ve got any questions, be sure to ping us in the comments below!