
Ultimate Guide To Email Marketing

Email marketing A/B testing made simple

Marketing is a mix of art and science. Before the Internet, marketing was mostly seen as an art.

Have you seen Mad Men? Don Draper meditates on the beach and somehow comes up with an amazing idea that entices millions of people to buy Coke.

How did he know his idea would work? 

The answer is that he had no idea. He got lucky.

You’ve probably gotten lucky many times too—making decisions with no factual backing beyond “it just feels right.”

Though this is totally fine, your gut isn’t the best guide for every decision. Especially when it comes to email marketing.

You might love that emoji in the subject line, but John from Oakland is a lot more tempted to click the emoji-less version. Different strokes for different folks!

The only way to find out which emails appeal to the majority is to put them to the test.


Email A/B split testing is the only way to statistically prove which email campaign brings you the most success. It’s also the fastest way to figure out what your audience likes (and optimize your email campaigns accordingly).

If you want to see real results from your email campaigns, you have to use A/B tests.

Not sure where to start? You’re not alone. 

Many email marketers aren’t familiar with A/B testing yet. It’s time to remove the mystery around A/B testing and show you how easy it is to get the most out of your email marketing.

We’re about to teach you all about email A/B testing: what it is, how and when to use it, and what to test. It’s easier than you think. Heck, after reading this guide you’ll wonder why you haven’t done it before.


Email A/B testing, also known as email split testing, is sending 2 different versions of your email to 2 different sample groups of your email list. The email which receives the most opens and clicks (aka “the winning version”) will be sent out to the rest of your subscribers.

Most people skip email A/B testing in marketing because they don’t know how or what to test. If this is you, read on. It is easier than you think and you’ll discover a huge opportunity to improve your campaigns.

A/B split testing is just a way of evaluating and comparing two things. 

Smart marketers do this because they want to know:

  • Which subject line gets the best open rate

  • Whether their target audience is more drawn to emojis or not

  • Which button text makes people most eager to click

  • What imagery in their emails drives better conversions

  • What preheader text generates the best open rate

  • Etcetera—there’s so much to discover!

With email marketing A/B tests you can improve your metrics, increase conversions, get to know your audience and find out what’s generating sales.

And the testing part itself is a breeze.

In your email marketing tool, you simply set up 2 emails that are exactly the same except for 1 variable, such as a different subject line. You then send the 2 emails to a small sample of your subscribers to see which email is more effective.

Half of your test group receives Email A and the other half gets Email B. The winner is determined by what you are trying to measure. For instance, if you want to know which version attracts more people to open your emails, you use the open rate as your success metric. Let's say Version B gets higher open rates. Then it will be sent automatically to the rest of your subscribers, because it statistically performs better. It could even become an email template for future marketing campaigns!
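
To make that selection logic concrete, here’s a minimal Python sketch of how a winner could be picked by open rate. The numbers and the helper function are hypothetical, purely for illustration—your email tool does all of this for you behind the scenes.

```python
# Hypothetical campaign stats; any email tool reports these numbers for you
variants = {
    "A": {"sent": 1000, "opens": 212},
    "B": {"sent": 1000, "opens": 241},
}

def winner_by_open_rate(stats: dict) -> str:
    """Return the variant name with the highest open rate."""
    return max(stats, key=lambda v: stats[v]["opens"] / stats[v]["sent"])

print(winner_by_open_rate(variants))  # -> B (24.1% beats 21.2%)
```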


Setting up your A/B test email campaign is easy in MailerLite. Pick exactly what you want to test, create 2 (or multiple) versions, choose your best sample size for each variation and off you go.

While the setup is straightforward, there are a few details in each step that are important to ensure accurate results. Let’s take a look. 

First, decide which variable to test

When you test 2 subject lines, the open rate will show which one appealed most to your subscribers. When you test 2 different product images in your email layout, you’ll want to look at the click-through rate (and conversions) instead.

Two emails can show different results depending on which metric you look at. In the example below, the plain-text version had a better open rate, but when it came to clicks, the designed template was more successful.

Why? Because the design version contained the video as a GIF—which attracted more people to click.

[Image: A/B testing the subject line]

A second thing to look at is: How do you pick the correct sample size? 

When you have a big email list (over 1,000 subscribers), we recommend sticking to the 80/20 rule (also known as the Pareto principle). 

Meaning, focus on the 20% that will bring you 80% of the results. For A/B tests, this means sending variant A to 10% of your subscribers and variant B to another 10%. The winning variant is then automatically sent to the remaining 80%.

[Image: A/B test group sample example]

The reason we recommend this principle for bigger lists is that you want statistically significant, accurate results. The 10% sample for each variant needs to contain enough subscribers to show which version had more impact.
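
If you want to sanity-check the arithmetic, here’s a tiny Python sketch of that 10/10/80 split. The function name and the 10,000-subscriber list are made up for the example.

```python
def ab_split_sizes(list_size: int, test_share: float = 0.20):
    """Split a list into two equal test groups plus the winner group."""
    per_variant = int(list_size * test_share / 2)  # 10% gets variant A, 10% gets B
    winner_group = list_size - 2 * per_variant     # remaining 80% gets the winner
    return per_variant, per_variant, winner_group

print(ab_split_sizes(10_000))  # -> (1000, 1000, 8000)
```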

When you’re working with a smaller list, the percentage of subscribers you include in the A/B test needs to be larger for the results to be statistically significant. If you have fewer than 1,000 subscribers, you probably want to test on 80-95% of your list and send the winning version to the small remaining percentage.

After all, if 12 people click on a button in email A and 16 people do so in email B, you can’t really tell which button performs better. Make your sample size large enough to get statistically significant results.
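
To see why 12 versus 16 clicks proves nothing, you can run a standard two-proportion z-test. The sketch below assumes each variant went to 100 subscribers (the example above doesn’t say); the p-value comes out around 0.4, far above the usual 0.05 significance threshold.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(clicks_a: int, n_a: int,
                           clicks_b: int, n_b: int) -> float:
    """Two-sided z-test for a difference between two click rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(z))

# 12 vs 16 clicks, assuming 100 recipients per variant (hypothetical)
print(two_proportion_p_value(12, 100, 16, 100))  # ≈ 0.4: not significant
```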

Using the Evan Miller sample size calculator

You can calculate the right sample size for your A/B test using the Evan Miller sample size calculator. Let’s see what it looks like:

[Image: Evan Miller sample size calculator]

As shown above, this calculator answers the question “How many subjects are needed for an A/B test?” 

In this guide, we won’t go into all the technical details, but we will teach you the main points you need to know to be able to understand how to use the calculator for your own A/B test.

  • Sample size: This is the result we’re looking for when using this calculator

  • Baseline conversion rate (BCR): This is your current conversion rate

  • Minimum Detectable Effect (MDE): The smallest change from your baseline that the test should be able to detect

The MDE depends entirely on whether you want to detect small or big changes from your current conversion rate. You’ll need less data (a smaller sample size) to detect big changes, and more data to detect small ones. 

If you want to detect small changes, set the MDE low (for example, at 1%). 

To detect larger changes, the MDE percentage will be higher. Be careful not to set it too high though: with a very high MDE, your test will miss smaller (but real) differences between version A and version B. 

The Evan Miller calculator shows you how big your sample should be. In the example above, each variation should include 1,030 email subscribers.

When using the calculator you have to keep in mind that it’s a tradeoff between 2 things:

  1. Sample size: How much data you need to collect

  2. Statistical significance: How confidently you want to be able to tell whether group A or group B had better results
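
If you’d rather compute the sample size yourself, calculators like this one are built on a standard two-proportion power calculation. Here’s a Python sketch of one common formulation; with the inputs from the example above (20% baseline, 5-percentage-point MDE, 95% significance, 80% power) it lands on about 1,030 subscribers per variation, matching the calculator.

```python
from math import sqrt
from statistics import NormalDist

def sample_size_per_variant(bcr: float, mde: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Subscribers needed in each variant of a two-sided A/B test.

    bcr -- baseline conversion rate (e.g. 0.20 for a 20% open rate)
    mde -- minimum detectable effect as an absolute change (e.g. 0.05)
    """
    p1, p2 = bcr, bcr + mde
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_power = NormalDist().inv_cdf(power)          # desired statistical power
    n = (z_alpha * sqrt(2 * p1 * (1 - p1))
         + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / mde ** 2
    return round(n)

print(sample_size_per_variant(0.20, 0.05))  # -> 1030 per variation
```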

In the example below we ran an A/B test where each group contained around 23,300 subscribers. You’ll see that version A had a better open rate. From this test we’ve learned that our customers react better to the words “new feature” versus “introducing.”

[Image: A/B test email open rates]

A third factor to take into account is the timing window.

When do you normally open an email? Your answer is probably: it depends.

You might be online, see the email come in and click within 5 minutes. Or you might only see the newsletter 2 hours after it landed in your inbox. Or perhaps the subject line doesn’t grab you and you leave the email unopened.

These are all real scenarios, which is why you should have an adequate time window when running an A/B test. 

For variables like subject lines, where opens are the metric, you can send the winner as early as 2 hours after the test starts. If you’re measuring click-throughs, you might want to wait longer. When you’re testing your newsletter on active subscribers, you can shorten the waiting time.

Research has shown that if you wait 2 hours, the accuracy of the test will be around 80%. The longer you wait beyond that, the more accurate your results become. To hit an accuracy of 99%, it’s best to wait an entire day.

Be aware that a longer waiting time is not always better. Some newsletters are time-sensitive and should be sent asap. In other situations, waiting too long will result in the winning email being sent at the weekend. A weekday versus a Saturday or Sunday can make a lot of difference in your email stats (check out this article if you’re wondering when is the best time to send your email).

The main rule for finding the right send time: every business is different, so it’s essential to monitor your metrics and continue to test.

Your fourth factor is the delivery time.

Keep in mind that the winning email is automatically sent once the testing period is complete. As this group likely contains the most subscribers, it’s a good idea to schedule your campaign so the winning email reaches these people at the right time.

Let’s say you’re testing 2 subject lines on 20% of your subscribers (each group contains 10%). You want the winning newsletter to arrive in people’s inboxes at 10 AM and you want to test the open rate for 2 hours.

This means you have to start your test at 8 AM, so your A/B test can run for 2 hours before the winning variant is sent out at 10 AM.
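
The back-planning is simple subtraction. Here’s a minimal sketch with Python’s datetime; the date itself is just a placeholder.

```python
from datetime import datetime, timedelta

winner_delivery = datetime(2024, 5, 7, 10, 0)  # winner should land at 10 AM
test_window = timedelta(hours=2)               # open-rate test runs for 2 hours

test_start = winner_delivery - test_window
print(test_start.strftime("%H:%M"))  # -> 08:00, when the A/B test must begin
```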

Finally, the fifth factor is to test only one variable at a time.

Imagine you’re sending two emails at the same time. The content and sender’s name are identical. The only thing that differs is the subject line. After a few hours, you see that version A has a much better open rate.

When you only test 1 thing at a time and you see a clear difference in the metric you’re analyzing, you can draw an accurate conclusion. However, if you had also changed the sender’s name, it would be impossible to conclude that the subject line made all the difference.

Example of an A/B split test

Are you wondering if your opens will improve by changing the subject line versus the preheader?

To find out your winning combo, you should run two separate tests.

  1. First, test two different subject lines with the same preheader

  2. Conduct a second test using the winning subject line with two different preheader texts

  3. Once you’ve tested both variables on their own, you can combine the winning subject line and preheader for optimal results

Which email variables could you test (and how)? 

For example, you could A/B test:

  • Subject line

  • Personalization (using the sender’s name or subscriber's name)

  • Images

  • Email design

  • Email layout

  • Preview text

  • CTAs

  • Different testimonials

  • Links and buttons

  • Copywriting (length, word order, tone)

  • Headline text

  • Closing text

  • Offer types

Wondering how to set up an A/B test in MailerLite? Here’s a step-by-step tutorial video to show you the way:


Since you can test literally any variable within your email campaign, figuring out where to start can feel overwhelming. 

Will you A/B test the subject line? The color of a call to action button? The sender’s name? 

Each of those things is likely to have an effect on different parts of the conversion process.

The main thing to remember is: don’t stress! You’ll have a new opportunity to test an element with each email you send. Here are the 5 best elements to get started with.

1. Test email subject lines

An effective subject line can make all the difference in a successful campaign. Testing subject lines is a good way to start optimizing your emails. MailerLite makes it easy to test subject lines with a simple click.

Things to test in your subject line can be:

  • Question

  • Number

  • Emoji

  • Capitalization

  • Ambiguous versus specific

  • First person versus second person

  • Personalization

  • Length

  • Urgency

When you are on the A/B test page, select Email subject and type in the 2 subject lines you want to test.

[Image: Email subject line A/B test]

Not sure what subject lines work best? Here are some approaches that you can A/B test.

Ask a question in your subject line

Email studies have repeatedly shown that asking questions can have a very positive effect on open rates. Questions feel incomplete on their own. Using a question will inspire readers to open your email in search of an answer.

Questions starting with ‘Did you know . . .’ and ‘Do you want . . .’ are great ways to catch your readers’ attention and keep them reading. An A/B test helps to see if this method also works for your audience.

A subject line with a question example

[A] 3 ways to boost your productivity

[B] Do you know the 3 tricks that’ll boost your productivity?

Make your subject line more personal

Using your reader’s name or location in the subject line adds a feeling of connection, especially when it’s a name. Additionally, it can increase your click-through rates. 

Personalization is an effective tactic, but don’t use it every time you send an email. It’s all about the surprise factor. If newsletter subscribers see their name on every subject line, it loses its sparkle.

A name-based subject line example

[A] 20% off sale!

[B] Janet, we’re giving you 20% off this week

Use symbols or numbers

Using non-standard characters like ☞ or statistics in the subject line is an easy way to stop a reader’s wandering eye and capture attention. Just make sure you are using these non-standard characters in a relevant way. The subject line should always relate to the content within your email.

Furthermore, you can experiment with emojis in your subject line. Emojis have different effects on different target audiences but can improve your email stats when used correctly.

An emoji-influenced subject line example

[A] Activate your account to get started

[B] 💥 Activate your account to get started

Create a sense of urgency

Fear of loss is a stronger motivator than the desire to gain. We all hate missing out. Craft your offer so it feels like your audience will lose something if they do not open the email.

Subject lines that create a sense of urgency or exclusivity can result in 22% higher open rates. Using phrases like “don’t miss out,” “today only” or “24-hour giveaway” in your subject lines will encourage your readers to act immediately.

A subject line example that expresses urgency

[A] Get your free sample box

[B] Last day to get your FREE sample box

2. Test sender’s name


In our experience, the ‘From’ field is even more important than your subject line. If you have a personal connection with your audience, they will open your email regardless of the subject line.

Case in point: you wouldn’t care about the subject line when you get an email from your mum. We usually open and read emails that are sent by someone we trust and know.

Here are some ways you can change the ‘From’ field and A/B test it.

A real person versus a company name

If your campaign comes from a company, experiment with using your company name versus inserting the name of one of your employees (e.g. the marketing person or CEO/founder).

This tactic can result in higher open rates, though it depends on your target audience. Testing different variables is the key to discovering what advice you should be following.

We chose to stick with MailerLite, as we have too many contributors and our customers would get lost in all the names. Within the email, we do show who was responsible for the content.

It’s important here that after you’ve tested your favorite sender name, you stick with it. This way your email subscribers can easily start recognizing your emails.

Employee test

[A] MailerLite

[B] Ilma from MailerLite

Your full name versus the first name only

If you’re a blogger and your name is your brand, you can test whether just your first name or your entire name works best.

Full name test

[A] Seth

[B] Seth Godin

Different email addresses

If you’re sending emails from info@yourcompany.com, you can test whether readers are more willing to open your emails when they’re sent from an actual person working at your company.

Email address test

[A] info@mailerlite.com

[B] ilma@mailerlite.com

3. Test email content


Testing the content of your newsletter can be tricky because if you’re simply changing the text, it’s hard to identify the one variable that causes a conversion. One aspect of your content that can be reliably tested is the CTA, short for call to action.

The CTA is the most important part of securing clicks. It’s the final gateway before a reader converts.

Here are some aspects of CTAs you can change and test.

Repeated CTAs

Including too many links is overwhelming – but having just 2 or even 3 links pointing to the same ultimate goal generally leads to a lift in conversions rather than a drop-off. CTAs are best placed on a clickable button. Try to repeat your CTA in your signature or postscript (P.S.) – you’ll be surprised by the results.

Text on buttons

Try shorter and longer text versions on buttons. Test out a typical CTA versus a creative one. Play around with text to see which word or phrase converts more visitors.

CTA text ideas

[A]

  • Buy Now

  • Purchase

  • Add to Cart

[B]

  • I’m in!

  • Let’s do this!

  • Gimme this

Play around with different sizes, fonts and even ALL CAPS

For some reason, we found that buttons with all caps perform the best. See if it works for you.

Different colors for buttons

Use contrasting colors. Some marketers say that red increases click-through rate, though the color in your email needs to fit with your marketing campaign.

Try varying the location of your CTA button

For example by making some CTAs more prominent than others. It can make a big difference when the button is placed above the fold so readers can immediately see it without having to scroll.

Consider using arrows (→) and other visual elements to guide the reader

Some of the most successful CTAs out there have arrows pointing at them. It creates a sense of direction and guides the visitor to an important element on the page. This is a way of prioritizing information and creating a flow.

4. Test images (or no images)


Images are a powerful tool for convincing your customers to act. Experimenting with images is a fun way to take the pulse of your readers, see what types of images they respond to and find out how your images can drive engagement. 

Here are some ideas of what images you can test:

  • Image of people versus a product is a good place to start

  • One versus multiple images

  • Text on image versus no text

  • Screenshot of a video

  • Animated GIF versus a static image

  • Serious straightforward image against a goofy one

  • Colorful visuals versus black and white visuals

  • A stock image versus an image of your employees or customers in action

Want to know all about images in emails?

Read our guide to using images in email newsletters, where we go in-depth with best practices and examples.

5. Test your preheader text

When customers receive your email, the subject and preheader text will be the key elements they use to determine if opening and interacting with your email is worth their time.

The preheader text is like a continuation of a subject line, so you can test it in the same way as a subject line: ask a question, create a sense of urgency and so on.

Don’t repeat the same information that’s already in your subject line. Instead, use the space to elaborate on the subject line.

💡 Want to A/B test your workflow emails?

You can now implement A/B tests into your automation workflows. Whether you just want to optimize your welcome email or test every message in a complex onboarding sequence, A/B testing for automation will help you fine-tune your email automations and clear a path to email marketing success.


The fun doesn’t stop with emails! Did you know that you can also A/B test your landing pages with MailerLite? Just like with your emails, you can test different versions of your landing pages to find out which one is most effective and make some tweaks accordingly. 

You can test lots of different elements in your landing pages, including:

  • Product descriptions: Find out what resonates most with your page visitors

  • CTA: Test elements like the color scheme, positioning, sizing and copy

  • Headline and copy: Learn what messaging triggers the most engagement

  • Images and videos: Check whether your page visitors prefer minimalist pages, or louder ones with lots of graphics and videos

With MailerLite, you can test up to 5 different content combinations by splitting traffic between the landing pages, and see the results in your dashboard. So really, it’s more of an A/B/C/D/E test. 
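
Under the hood, multi-variant traffic splitting usually works by deterministically bucketing each visitor so the same person always sees the same page. The sketch below isn’t MailerLite’s actual implementation, just a common hash-based approach in Python; the variant names and visitor ID are made up for the example.

```python
import hashlib

VARIANTS = ["A", "B", "C", "D", "E"]  # up to 5 content combinations

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same page."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("visitor-42"))  # same visitor -> same variant every time
```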


As you’ve learned, there is nothing mysterious or complicated about A/B testing. In fact, email marketing is much harder without A/B testing. Your campaigns will not improve without learning what works and what doesn’t.

With MailerLite, it’s super easy to set up an A/B test. Start small and try testing your subject line to improve your open rates. Once you get the hang of it, you can go on and try one of the A/B tests above.

The only challenging thing about A/B testing is that you’re never really done. There is no end to what can be tested and what knowledge can be gleaned from testing. If something works well in a few A/B split tests—keep doing it, and move on to test another aspect of your email. Also remember, what works today will not necessarily work tomorrow.

Ready to have fun? Start A/B testing today to improve your email and landing page conversions tomorrow.