Did Your Customer Pass the A/B Test?
Whether you run a business, organization, or nonprofit, you flourish with repeat customers. Email campaigns are the most cost-effective way to cultivate your list members into repeat customers while attracting new ones. But are you getting the most out of your mailings? Are you satisfied with the outcome of your campaigns? Do you see increases in ROI, conversions, or subscriber rates because you test your emails, or are you one of the many senders who don't test and just hope the results will be better every time you mail?
For those taking a back seat on testing, it's been shown that list owners who employ email A/B testing hit their goals more often than those who don't.
A/B testing is easy; the hardest part may be deciding what to test. So how do you decide? How do you measure the results, and how much trial and error should you expect? Keep in mind that you cannot test everything at once if you expect accurate results.
To start, A/B testing is a method of comparing different versions of the parts of an email against each other to determine which one performs better. Form a hypothesis about how you could improve your campaign, set up your test, and send it out. In theory, this method gives you solid evidence of what performs well and what does not. Through testing, you get a clear idea of your customers' preferences, which helps you create the type of email they will read and act on.
Decide what to test:
- Subject Line
- Type of Personalization
- Length of email
- Landing page
- The time you send out your email
Each item on this sample list affects your conversion process differently. Change your subject line to test open rates; if you're testing for advertiser click-throughs, change the position of advertiser links within your email, and so on.
Whether you're testing for advertiser clicks or opens, test what you consider most important first. If opens are your main concern, the subject line might be number one on your list.
Decide who to test
Should you test your entire list by splitting it in half, or segment it into smaller groups? If you segment into smaller groups, select email addresses at random unless you're testing something demographic, like people in a particular zip code or women only. Never hand-pick individual email addresses for a segment when you're A/B testing; hand-picked addresses will distort the results.
Segments work well when you're testing for conversions on a time limit, such as a sale. You can test what you need with a few hundred recipients, take the best-performing version, and then send it to your entire list.
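As a rough sketch of the random split described above (the list, segment size, and function name here are hypothetical, not part of any ESP's tooling):

```python
import random

def split_for_ab_test(addresses, segment_size=None, seed=None):
    """Randomly split a mailing list into A and B segments.

    Random selection (rather than hand-picking addresses) keeps the
    two segments comparable, so differences in results can be
    attributed to the change being tested.
    """
    rng = random.Random(seed)
    shuffled = addresses[:]  # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    if segment_size is None:
        segment_size = len(shuffled) // 2  # 50/50 split of the whole list
    group_a = shuffled[:segment_size]                   # control
    group_b = shuffled[segment_size:2 * segment_size]   # variant
    return group_a, group_b

# Example: test on a few hundred recipients, then mail the winner to everyone.
subscribers = [f"user{i}@example.com" for i in range(5000)]  # hypothetical list
group_a, group_b = split_for_ab_test(subscribers, segment_size=300, seed=42)
```

A fixed `seed` is only there to make the example repeatable; in practice you would let the split vary from test to test.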
After you decide what to test for and who to test, the results may not be what you expect. Statistics from previous campaigns using the same format, where you made subtle changes when testing, will be most helpful when you measure success. Success, however, depends on your testing goal: more opens, more calls to action, increased conversions, and so on.
Don’t Make it Hard On Yourself
Dundee hosted ListManager has built-in tools for A/B testing. There is no limit to the number of tests you can run concurrently, and the tests are free. Most ESPs offer some type of built-in A/B testing. A built-in A/B testing section in your hosting platform is easier to deal with than setting up a manual A/B test, though a manual test can be done.
Analyze the Results
What the numbers tell you depends, again, on what you decided to test. For example, if your goal is to increase visitors to your website and conversions, you may find that while 10% of segment B clicked the link to your landing page, only 15% of those visitors went deeper and purchased. Your second test resulted in a 25% increase in clicks, but once subscribers reached the website, they left. Logically, the second test should have produced more conversions, but in this case it didn't.
Why didn't the numbers work? Because emails and landing pages go hand in hand. To be successful, those clicks in the second test should have taken visitors to the call to action your test email promised. Subscribers expected, for example, the clickable link to take them to the Lawnmower Close-Out Sale described in your email, not to your home page where they had to search for the sale.
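To make that comparison concrete, here is a small sketch (the counts are invented to mirror the example above, not real campaign data) that turns raw sends, clicks, and purchases into the rates you would compare between segments:

```python
def funnel_rates(sent, clicks, purchases):
    """Return click-through rate and click-to-purchase conversion rate."""
    click_rate = clicks / sent
    purchase_rate = purchases / clicks if clicks else 0.0
    return click_rate, purchase_rate

# Hypothetical results mirroring the example:
# first test: 1000 sent, 100 clicked (10%), 15 of those clickers purchased
a_click, a_buy = funnel_rates(1000, 100, 15)
# second test: 25% more clicks, but almost nobody purchased after landing
b_click, b_buy = funnel_rates(1000, 125, 3)

# More clicks does not guarantee more conversions:
assert b_click > a_click and a_buy > b_buy
```

Tracking both rates separately is what exposes a mismatch between the email and its landing page: the second version wins on clicks and still loses on revenue.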
Here are a few best practices to keep in mind when running an email A/B test:
- Use your ESPs built-in tools for easy setup
- Normally, the A group is your control while B is the variant group
- Run your tests in parallel with each other
- Test, test, test, often and early
- Test one variable at a time, or to test more use multivariate testing (very doable with ListManager)
- Make sure your landing pages match your Call-To-Action
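One way to picture the multivariate option mentioned above (the variables and values here are purely illustrative): each combination of variables becomes its own test cell, which is why the number of segments you need grows quickly.

```python
from itertools import product

# Hypothetical variables under test
subject_lines = ["Lawnmower Close-Out Sale", "Last Chance: Mowers 40% Off"]
send_times = ["08:00", "18:00"]
link_positions = ["top", "bottom"]

# Every combination is a separate cell that needs its own random segment
cells = list(product(subject_lines, send_times, link_positions))
print(len(cells))  # 2 x 2 x 2 = 8 combinations
```

Eight cells means eight segments large enough to measure, which is why single-variable A/B tests remain the default and multivariate runs are reserved for bigger lists.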
When you test, you can split your list or take a small sampling of addresses; it all depends on the objective of your email campaign. Always track the trackable, such as opens and click-throughs. Send the winning version to your entire list and tweak as necessary. And remember: A/B email testing should be a vital part of your planning and execution stages, just as important as the content you're sending.
Try ListManager for FREE today! https://mailinglistservices.com/order-2