This article provides guidance on how to create, design, and send an A/B test using Spotler MailPro. A/B testing allows you to determine which version of your email performs best, whether it's testing subject lines, button colors, content, or calls to action.
In this article, you will learn more about:
- Creating an A/B Test plan
- Designing an A/B Test
- Sending an A/B Test
- FAQ
- Common tips
Creating an A/B Test plan
Before starting your A/B test, there are several things you need to think about:
- What do you want to test (subject line, button color, email content, text in the button, etc.)?
- How many variants do you want to test?
- When, at the latest, should the remaining batch (the winner) be sent?
- What is the minimum size of each test group? (See the sketch after this list.)
- How long should the test run at minimum?
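How large should a test group be? That depends on the size of the effect you hope to detect. The sketch below uses the standard two-proportion sample-size approximation; the baseline rate, expected uplift, confidence level, and power are assumptions you should adapt to your own list.

```python
import math

def min_group_size(p_base: float, p_variant: float,
                   z_alpha: float = 1.96,   # 95% confidence, two-sided
                   z_beta: float = 0.84     # 80% statistical power
                   ) -> int:
    """Rough minimum number of recipients per test group.

    p_base:    expected rate (e.g. open rate) of the current mailing
    p_variant: rate you hope the new variant achieves
    """
    effect = abs(p_variant - p_base)
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Example: detecting a lift from a 20% to a 25% open rate
print(min_group_size(0.20, 0.25))  # about 1,090 recipients per group
```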
Designing an A/B Test
To design your A/B test, go to Module Content > Emails. Create an email message, then copy it to create a test variant. You can modify the subject line, header image, or call-to-action button, for example.
Sending an A/B Test
To send the A/B test, go to Module Send > Send mailing. Only the fields that differ from a normal mailing are described below:
Important
Unlike a normal mailing, an A/B test has no "test/definitive" option. Despite the word "test" in the name, an A/B test is always a definitive sending.
Name of the A/B Test
This name is used only in MailPro, serving as a reference so you can find it later in the statistics.
Mailing A/ Mailing B/ Mailing C/ Mailing D
Here, you can select 2 to 4 mailings that you want to compare/test. A and B are mandatory, while C and D are optional.
Group size
After selecting the mailings, use the slider to set the group size for the sending. This creates 2 (or up to 4) equal test groups; the remaining recipients receive the winning variant afterwards. The group size is shown as a percentage, so the actual number of recipients depends on the total size of the target group.
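For reference, the random split behind the slider can be reproduced in a few lines of Python, which is also handy if you ever split groups manually (see the FAQ below). This is a sketch, not MailPro's internal code; the recipient list and percentage are example inputs.

```python
import random

def split_for_ab_test(recipients: list[str], n_variants: int,
                      test_pct: float) -> tuple[list[list[str]], list[str]]:
    """Randomly split recipients into equal test groups plus a remainder.

    test_pct is the share of the total list that each test group gets,
    mirroring the "Group size" slider (shown as a percentage in MailPro).
    """
    pool = recipients[:]      # copy so the original list stays untouched
    random.shuffle(pool)      # assignment is random, as in MailPro
    group_size = int(len(pool) * test_pct / 100)
    groups = [pool[i * group_size:(i + 1) * group_size]
              for i in range(n_variants)]
    remainder = pool[n_variants * group_size:]  # receives the winner later
    return groups, remainder

# Example: 10% of the list to each of mailings A and B, 80% awaits the winner
test_groups, winner_group = split_for_ab_test(
    [f"user{i}@example.com" for i in range(10_000)], n_variants=2, test_pct=10)
```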
Determine the winner
You can select the winner based on five criteria:
- Most (unique) opens – Which mailing (A/B/C/D) has the most unique opens? A unique open means the email was viewed (fully or in the preview pane) and its images were downloaded. If the images are not downloaded but the recipient clicks any link, it still counts as an open.
- Most (unique) email renders – Which mailing (A/B/C/D) has the most unique renders? A unique render means the email was viewed (fully or in the preview pane) and its images were downloaded. If the images were not downloaded, it does not count as a render.
- Most (unique) clicks – Which mailing (A/B/C/D) has the most unique clicks? A unique click means the number of different people who clicked on one or more links.
- Most (unique) clicks on a specific link – Select the link that you want to test. Which mailing (A/B/C/D) had the most unique clicks on this specific link? Other links and opens do not influence the results. Make sure the link titles match in all mailings being compared, otherwise the results cannot be matched.
- Most (unique) conversion hits – Select a conversion point. This criterion is only available if the conversion pixel is enabled (covered in a separate article). It tracks not just the clicks, but whether the visitor actually took the desired action on the landing page.
Open rates and renders indicate the effectiveness of the subject line. Click rates give insights into the content of the email, and conversions measure the action taken on the landing page.
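Conceptually, determining the winner is simple: the mailing with the highest count for the chosen criterion wins. A minimal sketch, assuming the per-mailing statistics are available as plain counts (the labels and metric names are illustrative, not MailPro's API):

```python
def determine_winner(stats: dict[str, dict[str, int]], criterion: str) -> str:
    """Return the mailing (A/B/C/D) with the highest count for the criterion."""
    return max(stats, key=lambda mailing: stats[mailing][criterion])

stats = {"A": {"opens": 310, "clicks": 42},
         "B": {"opens": 355, "clicks": 51}}
print(determine_winner(stats, "opens"))  # B
```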
How long to wait before sending the winner
The A/B test is an automatic process. This means you set it up once, and the mailing will be sent automatically. First, the A/B (C/D) test is sent to a portion of the group, and then the remaining group gets the winning version. The time to wait before sending the winner can be set between 2 hours and 120 hours.
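Put together, the automatic process runs roughly as sketched below. This is a simplified illustration, not MailPro's implementation: only the 2-to-120-hour window comes from the sending screen, and the sending and statistics functions are hypothetical placeholders.

```python
import time

MIN_WAIT_HOURS, MAX_WAIT_HOURS = 2, 120   # limits from the sending screen

def send_mailing(label: str, recipients: list[str]) -> None:
    """Hypothetical placeholder: MailPro performs the actual delivery."""
    print(f"Sending mailing {label} to {len(recipients)} recipients")

def fetch_unique_opens(label: str) -> int:
    """Hypothetical placeholder: MailPro tracks these counts for you."""
    return {"A": 310, "B": 355}.get(label, 0)

def run_ab_test(test_groups: dict[str, list[str]],
                remainder: list[str], wait_hours: int) -> None:
    """Simplified outline of the automatic A/B test process."""
    if not MIN_WAIT_HOURS <= wait_hours <= MAX_WAIT_HOURS:
        raise ValueError("Wait time must be between 2 and 120 hours")
    for label, group in test_groups.items():
        send_mailing(label, group)                     # 1. send the variants
    time.sleep(wait_hours * 3600)                      # 2. wait for results
                                                       #    (a real system
                                                       #    schedules instead
                                                       #    of blocking)
    winner = max(test_groups, key=fetch_unique_opens)  # 3. pick the winner
    send_mailing(winner, remainder)                    # 4. remainder gets it

# Usage: run_ab_test({"A": group_a, "B": group_b}, remainder, wait_hours=24)
```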
To view the results of your A/B test, go to Module Reports > A/B test.
FAQ
What if there is no time to run the automatic A/B test?
If there is no time for the automatic A/B test, you can still test multiple variants. Create the mailings, split the groups yourself, and send each email to its corresponding group. The results can be found in the statistics and used for future mailings.
Can I test the moment of sending?
Yes, you can. However, you won’t be able to use the automatic A/B test in this case. Split the group into two or more groups and send the mailings at different times. The statistics for these mailings can be found and compared in the statistics section.
How are recipients divided over the test groups?
This is done automatically and at random, using the "Group size" option in the A/B test sending screen.
Common tips
A small group size can result in inaccurate data. Ensure that there are enough recipients to generate valid results.
Running the test for too short a time can lead to inconclusive results. Give your test enough time to collect meaningful data.
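A common way to check whether an observed difference is conclusive rather than noise is a two-proportion z-test. Below is a minimal, standard-library-only sketch; the open counts are made-up example figures.

```python
import math

def two_proportion_p_value(opens_a: int, sent_a: int,
                           opens_b: int, sent_b: int) -> float:
    """Two-sided p-value for 'the open rates of A and B differ'."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf, folded into a two-sided tail probability
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Example: 310/1500 unique opens for A versus 355/1500 for B
print(f"p = {two_proportion_p_value(310, 1500, 355, 1500):.3f}")  # ~0.048
```

A p-value below 0.05 is a common threshold for treating the difference as real; with small groups the p-value stays high even for sizeable differences, which is exactly the inaccuracy the tip above warns about.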
If you only change the content of the email and keep the inbox fields (subject line, sender name, etc.) the same, it will not impact the open rate. To test open rates effectively, the inbox fields, such as the subject line, must differ while keeping the content the same.