Marketing is a mix of art and science. Before the Internet, marketing was mostly seen as an art.
Have you seen Mad Men? Don Draper meditates on the beach and somehow comes up with an amazing idea that entices millions of people to buy Coke.
How did he know his idea would work?
The answer is that he had no idea. He got lucky.
You’ve probably gotten lucky many times too: making decisions without any factual reasoning to back them up, other than that it “just feels right.”
While that’s fine sometimes, your gut isn’t the best guide for every decision. Especially when it comes to email marketing.
You might love that emoji in the subject line, but John from Oakland is far more tempted to open the emoji-less version. Different strokes for different folks!
The only way to find out which emails appeal to the majority is to put them to the test.
A/B split testing is the only way to statistically prove which email campaign brings you the most success. It’s also the fastest way to figure out what your audience likes (and optimize your email campaigns accordingly).
If you want to see real results from your email campaigns, you have to use A/B tests.
Not sure where to start? You’re not alone.
Many email marketers aren’t familiar with A/B testing yet. It’s time to remove the mystery around A/B testing and show you how easy it is to get the most out of your email marketing.
We’re about to teach you all about email A/B testing: what it is, how and when to use it, and what to test. It’s easier than you think. Heck, after reading this guide you’ll wonder why you haven’t done it before.
Most people skip email A/B testing in marketing because they don’t know how or what to test. If this is you, read on. You’ll discover a huge opportunity to improve your campaigns.
A/B split testing is a way of evaluating and comparing two things.
Smart marketers do this because they want to know what actually resonates with their audience.
With email marketing A/B tests you can improve your metrics, increase conversions, get to know your audience and find out what’s generating sales.
And the testing part itself is a breeze.
In your email marketing tool, you simply set up 2 emails that are exactly the same except for 1 variable, such as a different subject line. You then send the 2 emails to a small sample of your subscribers to see which email is more effective.
Half of your test group receives Email A and the other half gets Email B. The winner is determined by what you are trying to measure. For instance, if you want to know which version attracts more people to open your emails, you use the open rate as your success metric.
Setting up your A/B test email campaign is easy in MailerLite. Pick exactly what you want to test, create 2 (or multiple) versions, choose your best sample size for each variation and off you go.
While the setup is straightforward, there are a few details in each step that are important to ensure accurate results. Let’s take a look.
When you test 2 subject lines, the open rate will show which subject appealed most to your subscribers. When you test 2 different product images in your email layout, you’ll want to look at the click-through rate (and conversions).
It can happen that two emails show different results depending on what you measure. In the example below, the plain-text version had a better open rate, but when it came to clicks, the design template was more successful.
Why? Because the design version contained the video as a GIF — which attracted more people to click.
When you have a big email list (over 1000 subscribers), we recommend sticking to the 80/20 rule (also known as the Pareto principle).
Meaning, focus on the 20% that will bring you 80% of the results. For A/B tests, this means sending variant A to 10% of your subscribers and variant B to another 10%. The better-performing variant is then sent to the remaining 80% of your list.
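The split described above can be sketched in a few lines of Python (a minimal illustration, not MailerLite's actual implementation):

```python
def split_sizes(total_subscribers, test_fraction=0.20):
    """Pareto-style A/B split: two equal test groups plus the remainder.

    With the default 20% test fraction, 10% of subscribers receive
    variant A, 10% receive variant B, and the winning email goes to
    the remaining 80%.
    """
    per_variant = int(total_subscribers * test_fraction / 2)
    remainder = total_subscribers - 2 * per_variant
    return per_variant, per_variant, remainder

print(split_sizes(10_000))  # (1000, 1000, 8000)
```

For a smaller list you would simply raise `test_fraction`, as discussed below for lists under 1,000 subscribers.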
The reason why we recommend this principle for bigger lists is because you want more statistically significant and accurate results. The 10% sample size for each variant needs to contain enough subscribers to show which version had more impact.
When you’re working with a smaller list, the percentage of subscribers you include in the A/B test needs to be larger for the results to be statistically significant. If you have fewer than 1,000 subscribers, you probably want to test on 80-95% of your list and send the winning version to only the small remaining percentage.
After all, if 12 people click on a button in email A and 16 people do so in option B, you can’t really tell which button performs better. Make your sample size large enough to get statistically significant results.
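A quick way to check whether a result like 12 vs. 16 clicks is meaningful is a two-proportion z-test. Here's a small sketch in plain Python, assuming hypothetical test groups of 100 recipients each:

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-statistic for the difference between two click rates.

    |z| must exceed roughly 1.96 for significance at the 95% level.
    """
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(12, 100, 16, 100)
print(round(z, 2))  # 0.82 -- far below 1.96, so no winner can be declared
```

In other words, a 12-vs-16 gap at this sample size is well within the range of random noise.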
Using the Evan Miller sample size calculator
You can calculate the right sample size for your A/B test using the Evan Miller sample size calculator. Let’s see what it looks like:
As shown above, this calculator answers the question “How many subjects are needed for an A/B test?”
In this guide, we won’t go into all the technical details, but we will teach you the main points you need to know to be able to understand how to use the calculator for your own A/B test.
The MDE (minimum detectable effect) depends entirely on whether you want to detect small or big changes from your current conversion rate. You’ll need less data (a smaller sample size) to detect big changes, and more data to detect small changes.
If you want to detect small changes, you need to set the MDE lower (for example on 1%).
To detect larger changes, the MDE percentage will be higher. Be careful not to set it too high, though. If the MDE is too high, smaller (but still meaningful) differences between version A and version B will go undetected.
The Evan Miller calculator shows you how big your sample should be. In the example above, each variation should include 1,030 email subscribers.
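If you'd rather compute this yourself, the classic two-proportion sample-size formula gives a close approximation. The sketch below assumes a 20% baseline open rate, a 5-percentage-point MDE, 95% significance and 80% power; Evan Miller's calculator uses a slightly different derivation, so its numbers differ a little:

```python
import math

def sample_size_per_variant(baseline, mde, z_alpha=1.959964, z_power=0.841621):
    """Classic two-proportion sample-size formula.

    `baseline` is the current conversion rate and `mde` the absolute
    minimum detectable effect. Default z-values correspond to 95%
    significance and 80% power.
    """
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / mde ** 2)

# Roughly 1,100 per variant; Evan Miller's formula yields 1,030 here
print(sample_size_per_variant(0.20, 0.05))
```

Either way, the takeaway is the same: the smaller the effect you want to detect, the larger each test group must be.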
When using the calculator, keep in mind that it’s a tradeoff between 2 things: the size of the change you want to detect (the MDE) and the number of subscribers you need in each variation. The smaller the change you want to detect, the larger your sample size has to be.
In the example below we ran an A/B test where each group contained around 23,300 subscribers. You’ll see that version A had a better open rate. From this test we’ve learned that our customers react better to the words “new feature” versus “introducing.”
When do you normally open an email? Your answer is probably: it depends.
You might be online, see the email come in and click within 5 minutes. Or you might first see the newsletter 2 hours after it arrived in your inbox. Or perhaps the subject line didn’t grab you and you leave the email unopened.
These are all real scenarios. Which is why you should have an adequate time window when running an A/B test.
While with variables like subject lines and opens you can send the winner as early as 2 hours after sending, you might want to wait for a longer time if you’re measuring click-throughs. When you’re testing your newsletter on active subscribers, you can shorten the waiting time.
Research has shown that if you wait 2 hours, the accuracy of the test will be around 80%. The longer you wait, the more accurate your results will be. To hit 99% accuracy, it’s best to wait 1 entire day.
Be aware that a longer waiting time is not always better. Some newsletters are time-sensitive and should be sent as soon as possible. Or waiting too long might result in the winning email going out on the weekend. A weekday versus a Saturday or Sunday can make a big difference in your email stats.
The main rule when it comes to defining the right waiting time is: Every business is different so it's essential to monitor your metrics and continue to test.
Keep in mind that the winning email is automatically sent once the testing period is completed. As this group likely contains the most subscribers, it’s a good idea to schedule delivery for a time when these people are most likely to read it.
Let’s say you’re testing 2 subject lines on 20% of your subscribers (each group contains 10%). You want the winning newsletter to arrive in people’s inboxes at 10 AM and you want to test the open rate for 2 hours.
This means you have to start your test at 8 AM, so your A/B test can run for 2 hours before the winning variant is sent out at 10 AM.
Imagine you’re sending two emails at the same time. The content and sender’s name are identical. The only thing that differs is the subject line. After a few hours, you see that version A has a much better open rate.
When you only test 1 thing at a time and you see a clear difference in the metric you’re analyzing, you can draw an accurate conclusion. However, if you had also changed the sender’s name, it would be impossible to conclude that the subject line made all the difference.
Are you wondering if your opens will improve by changing the subject line versus the preheader?
To find out your winning combo, you should run two separate tests.
For example, you could A/B test:
Wondering how to set up an A/B test in MailerLite? Here’s a step-by-step tutorial video to show you the way:
Since you can test literally any variable within your email campaign, it can seem overwhelming to figure out where to start.
Will you A/B test the subject line? The color of a call to action button? The sender’s name?
Each of those things is likely to have an effect on different parts of the conversion process.
The main thing to remember is: don’t stress! You’ll have a new opportunity to test an element with each email you send. Here are the best 5 elements to get started with.
An effective subject line can make all the difference in a successful campaign. Testing subject lines is a good way to start optimizing your emails. MailerLite makes it easy to test subject lines with a simple click.
Things to test in your subject line can be:
When you are on the A/B test page, select Email subject and type in the 2 subject lines you want to test.
Not sure what subject lines work best? Here are some approaches that you can A/B test.
Various email studies have shown that asking questions can have a very positive effect on open rates. A question feels incomplete on its own, which inspires readers to open your email in search of an answer.
Questions starting with ‘Did you know . . .’ and ‘Do you want . . .’ are great ways to catch your readers’ attention and keep them reading. An A/B test helps to see if this method also works for your audience.
[A] 3 ways to boost your productivity
[B] Do you know the 3 tricks that’ll boost your productivity?
Using your reader’s name or location in the subject line adds a feeling of connection, especially when it’s a name. Additionally, it can increase your click-through rates.
Personalization is an effective tactic, but don’t use it every time you send an email. It’s all about the surprise factor. If newsletter subscribers see their name on every subject line, it loses its sparkle.
[A] 20% off sale!
[B] Janet, we’re giving you 20% off this week
Using non-standard characters like ☞ or statistics in the subject line is an easy way to stop a reader’s wandering eye and capture attention. Just make sure you are using these non-standard characters in a relevant way. The subject line should always relate to the content within your email.
Furthermore, you can experiment with emojis in your subject line. Emojis have different effects on different target audiences but can improve your email stats when used correctly.
[A] Activate your account to get started
[B] 💥 Activate your account to get started
Fear of loss is a stronger motivator than the desire to gain. We all hate missing out. Craft your offer so it feels like your audience will lose something if they do not open the email.
Subject lines that create a sense of urgency or exclusivity can result in 22% higher open rates. Using phrases like “don’t miss out,” “today only” or “24-hour giveaway” in your subject lines will encourage your readers to act immediately.
[A] Get your free sample box
[B] Last day to get your FREE sample box
In our experience, the ‘From’ field is even more important than your subject line. If you have a personal connection with your audience, they will open your email regardless of the subject line.
Case in point: you wouldn’t care about the subject line when you get an email from your mum. We usually open and read emails that are sent by someone we trust and know.
Here are some ways you can change the ‘From’ field and A/B test it.
If your campaign comes from a company, experiment with using your company name versus inserting the name of one of your employees (e.g. the marketing person or CEO/founder).
This tactic can result in higher open rates, though it depends on your target audience. Testing different variables is the key to discovering what advice you should be following.
We chose to stick with MailerLite, as we have too many contributors and our customers would get lost in all the names. Within the email, we do show who was responsible for the content.
It’s important here that after you’ve tested your favorite sender name, you stick with it. This way your email subscribers can easily start recognizing your emails.
[A] MailerLite
[B] Ilma from MailerLite
If you’re a blogger and your name is your brand, you can test whether just your first name or your entire name works best.
[A] Seth
[B] Seth Godin
If you’re sending emails from info@yourcompany.com, you can test whether readers are more willing to open your emails when they’re sent from an actual person working at your firm.
Testing the content of your newsletter can be tricky because if you’re simply changing the text, it’s hard to identify the one variable that causes a conversion. One aspect of your content that can be reliably tested is the CTA, short for call to action.
The CTA is the most important part of securing clicks. It’s the final gateway before a reader converts.
Here are some aspects of CTAs you can change and test.
Including too many links is overwhelming – but having just 2 or even 3 links pointing to the same ultimate goal generally leads to a lift in conversions rather than a drop-off. CTAs are best placed on a clickable button. Try to repeat your CTA in your signature or postscript (P.S.) – you’ll be surprised by the results.
Text on buttons
Try shorter and longer text versions on buttons. Test out a typical CTA versus a creative one. Play around with text to see which word or phrase converts more visitors.
Play around with different sizes, fonts and even ALL CAPS
For some reason, we found that buttons with all caps perform the best. See if it works for you.
Different colors for buttons
Use contrasting colors. Some marketers say that red increases click-through rate, though the color in your email needs to fit the message.
Try varying the location of your CTA button
For example by making some CTAs more prominent than others. It can make a big difference when the button is placed above the fold so readers can immediately see it without having to scroll.
Consider using arrows (→) and other visual elements to guide the reader
Some of the most successful CTAs out there have arrows pointing at them. It creates a sense of direction and guides the visitor to an important element on the page. This is a way of prioritizing information and creating a flow.
Images are a powerful tool to convince your customers to act. Experimenting with images is a fun way to take the pulse of your readers: see what types of images they respond to and how your images can drive engagement.
Here are some ideas of what images you can test:
When customers receive your email, the subject and preheader text will be the key elements they use to determine if opening and interacting with your email is worth their time.
The preheader text is like a continuation of a subject line, so you can test it in the same ways as a subject line: ask a question, create a sense of urgency and so on.
Don’t repeat the same information that’s already in your subject line. Instead, use the space to elaborate on the subject line.
As you’ve learned, there is nothing mysterious or complicated about A/B testing. In fact, email marketing is much harder without A/B testing. Your campaigns will not improve without learning what works and what doesn’t.
With MailerLite, it’s super easy to set up an A/B test. Start small and try testing your subject line to improve your open rates. Once you get the hang of it, you can go on and try one of the A/B tests above.
The only challenging thing about A/B testing is that you’re never really done. There is no end to what can be tested and what knowledge can be gleaned from testing. If something works well in a few A/B split tests – keep doing it, and move on to test another aspect of your email. Also remember, what works today will not necessarily work tomorrow.
Ready to have fun? Start A/B testing today to improve your email conversions tomorrow.