If you’ve never done a split test with your email service before, here are a couple more tips. In part one, we talked about deciding which group jumps out at you — that segment of your email list you really must learn more about. We also talked about how to decide what to test, even if you don’t have much prior market data to fall back on. Continuing on…
Now stand in your customer’s (email reader’s) shoes and make some quick decisions. Trust that he is NOT poring over every word you write or analyzing each picture. (Only email wonks like me do that.) If this is your first A/B test, I recommend trying to optimize either your open rate or a click on a specific link; not both, and not a whole slew of links at one time. You will only defeat the purpose of the test. Remember that this is an A/B test, not an A/B/C/D test.
Let’s start with your open rate, a very common parameter to test:
Your reader may open your email based on a catchy subject line OR because your regular sending history (right?) has trained him to expect some bi-weekly information or a helpful coupon. That’s why the subject line split test is such a common test for open rate. Obviously, the subject line of your email exists to inform and compel your recipient to open. If your goal is more opens, try two subject lines.
An aside on subject line testing: As a copywriter, I used to look at split testing as a game. The email manager would ask me for two different subject lines and he would do all the rest. (Back then I would send copy in a Microsoft Word attachment to an email and my job was done; I didn’t do any back-room email creation within an email service in those days.) Then he’d let me know which one “won.” We’d try to guess which one would pull better, but in the end it didn’t really matter, because he would just send the other subject line to anyone who didn’t open the first one. Not really a split test so much as a backup plan, although we learned a lot about subject lines along the way; and our open rates got better.
Maybe you don’t want to test your subject lines. Maybe you’re concentrating instead on folks who already open most of your emails but for some reason don’t click through to your website. You can put another easy and interesting split test within your email.
Try changing the colors of your links, the layout of photos, or the size and font of your newsletter headings. If your email service has easy-to-use split testing software, I recommend creating two emails that are very similar to each other. There should be only one difference between them — a visual cue. Pop them into your email creation software.
Then (I’m not kidding; this works) literally stand back from your computer 5 feet or so and check to make sure they are different enough from each other to notice. Remember that you already know where they are different. The point of this step is to try to become like your unsuspecting reader who is used to receiving your emails. Would she be surprised — and take a second look — if your picture took the place of the usual “Contact me” button? If your most-read articles of the week were highlighted at the top instead of tucked in quietly at the bottom?
Now schedule the sending of both at the same time.
MyTeamConnects email software’s split test automatically sends email A to 20% of the list and email B to another 20%, then waits 6 hours before sending the winning email to the remaining 60% of that list. You can change the default ratios, of course; just slide the bar where you want it. Your split test set-up looks like this:
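MyTeamConnects handles this split for you behind the scenes, but if you’re curious about the mechanics, here’s a minimal Python sketch of that default 20/20/60 flow. The function names, sample addresses, and numbers are my own illustration, not the software’s actual API:

```python
import random

def split_list(subscribers, test_fraction=0.2, seed=42):
    """Partition a subscriber list into group A, group B, and a remainder.

    Mirrors the default 20/20/60 split described above. Shuffling first
    keeps the two test groups comparable.
    """
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)
    n = int(len(shuffled) * test_fraction)
    group_a = shuffled[:n]          # receives email A
    group_b = shuffled[n:2 * n]     # receives email B
    remainder = shuffled[2 * n:]    # receives the winner after the wait
    return group_a, group_b, remainder

def pick_winner(opens_a, sends_a, opens_b, sends_b):
    """After the 6-hour waiting period, compare open rates."""
    return "A" if opens_a / sends_a >= opens_b / sends_b else "B"

# Hypothetical 100-person list: 20 get A, 20 get B, 60 wait for the winner.
subscribers = [f"reader{i}@example.com" for i in range(100)]
a, b, rest = split_list(subscribers)
print(len(a), len(b), len(rest))                    # 20 20 60
print(pick_winner(opens_a=7, sends_a=20,
                  opens_b=5, sends_b=20))           # A (35% beats 25%)
```

The only decision your software is really making is that last line: whichever test email earns the better rate during the waiting period goes out to everyone else.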
Pretty easy, huh?
- Finally (this is the fun part), you’ll probably be curious to see the results of your A/B test after you’ve put all that thought into formulating it. Take a look at the data from the final campaign sent. Is it what you expected? Any big surprises, good or bad?
With a smallish list, the results may not be exactly clear, but you can perceive trends over time. Use your data, no matter how insignificant you think it is. Document your results in a separate file so that you don’t have to dig for them within your email service when you’re creating successive email campaigns.
- By the way, take heart if your results are slightly disappointing. You’re never going to slam one out of the park every time. Look at the percentage of success within your test group and then see if that carried over to the remaining group. If the final group’s clicks or opens are higher than the losing test group’s, then your test was a success!
- If not, try to understand why not, and hone your next test based on your best perceptions. Get a second opinion, too. You may be overlooking something obvious, like the time of day the second email group was sent, etc.
And know you’ll get better. Once you get going you can start testing smaller details with a little more confidence.
You, the small biz, really can have more fun marketing your brand than those huge companies that spend hundreds of thousands of dollars on each ad campaign and focus group. Just look at the flexibility you have split testing your emails in an extremely low-risk, low-cost environment. You’ll be amazed at the preferences of the people on your list if you take a couple of hours to create, send, and analyze a simple A/B test now and then.
We’d be happy to help if you need some ideas on what to test…that’s what we’re here for.
Written by Jen McGahan