Testing is a popular topic here at FreshAddress, simply because we’ve seen so many clients benefit from using tests to optimize their email campaigns (see this recent post!).
Most email marketers think of A/B (split) tests when faced with a choice of subject line, button color, or headline, but A/B testing isn't the only approach available. Let's take a look at the pros and cons of the three most popular testing types: A/B, A/B/C, and multivariate testing.
Take your proposed campaign email and create a second version that differs in some way from the original; perhaps it features a different subject line.
Then, create two test groups of email addresses from your email list: one group will receive Version A and one will receive Version B. The version that produces the best results is then sent to the rest of the list. Easy! You can also keep those results in mind when crafting subject lines for future campaigns.
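The split itself is easy to script. Here's a minimal sketch in Python of that process — shuffle the list, carve off two equal test groups, and hold the rest back for the winning version. The list contents, the 20% test fraction, and the function name are all hypothetical, purely for illustration:

```python
import random

def make_test_groups(email_list, test_fraction=0.2, seed=42):
    """Shuffle the list, carve off two equal test groups (A and B),
    and keep the remainder for whichever version wins."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(email_list)
    rng.shuffle(shuffled)
    per_group = int(len(shuffled) * test_fraction / 2)
    group_a = shuffled[:per_group]
    group_b = shuffled[per_group:2 * per_group]
    remainder = shuffled[2 * per_group:]
    return group_a, group_b, remainder

emails = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b, rest = make_test_groups(emails)
print(len(group_a), len(group_b), len(rest))  # 100 100 800
```

Random assignment matters here: if you split the list alphabetically or by signup date instead, the two groups may differ in ways that have nothing to do with your subject lines.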
The strength of A/B testing is its simplicity. The weakness is that you can only compare two email versions. Suppose, for example, you have four subject lines you want to compare. No problem…that’s where A/B/C testing comes in.
A/B/C testing is just an extended version of A/B testing. Instead of two versions of your email, you have three, four, or five versions, letting you test more variations of the particular element you're investigating.
The downside? You need more test groups. If you have a small list, it can be a challenge to gather enough email addresses in each test group to be confident that the differences you see are real and not simply down to chance.
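One way to see why group size matters is with a standard two-proportion z-test, which most testing tools run under the hood. A minimal sketch using only Python's standard library — the open counts below are made-up numbers for illustration:

```python
from statistics import NormalDist

def open_rate_significance(opens_a, sent_a, opens_b, sent_b):
    """Two-sided p-value for the difference between two open rates
    (standard two-proportion z-test)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    if se == 0:
        return 1.0
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Identical open rates (12% vs. 8%), two different group sizes:
big = open_rate_significance(60, 500, 40, 500)  # large groups
small = open_rate_significance(6, 50, 4, 50)    # small groups
```

With 500 addresses per group, that 12%-vs-8% gap comes out clearly significant (p well under 0.05); with only 50 per group, the exact same rates give a p-value around 0.5 — in other words, the "winner" could easily be luck.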
Of course, you don’t have to test just one element in A/B or A/B/C testing. You could give Version B a different headline, subject line, preheader, and button color. You’d still find a winning version, which is great for the current campaign. However, you wouldn’t learn much to apply to future campaigns: if Version B wins, you don’t know how each individual change affected the results. Was it the new subject line or the alternative headline that made the difference? Help is at hand with multivariate testing.
Each test group receives a different combination of components in multivariate testing. Group 1 might see Version A’s subject line, header, and button. Group 2 sees Version A’s header and button, but Version B’s subject line. Group 3 sees another combination, and so on. The results reveal the specific impact of each individual element, as well as the best combination of components. The challenge is that you need even more test groups, plus a good tool or expert help to ensure you set up and evaluate the tests correctly.
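One reason multivariate testing needs so many more groups: the number of combinations multiplies with every element you vary. A quick Python sketch using `itertools.product` — the variants below are invented for illustration:

```python
from itertools import product

# Hypothetical variants of three email elements
subject_lines = ["20% off ends tonight", "Your cart misses you"]
headlines = ["Spring Sale", "Last Chance"]
button_colors = ["green", "orange"]

# Every combination of the three elements = one test group each
combos = list(product(subject_lines, headlines, button_colors))
for group, (subject, headline, button) in enumerate(combos, start=1):
    print(f"Group {group}: {subject!r} / {headline!r} / {button} button")

print(len(combos))  # 2 x 2 x 2 = 8 test groups
```

Just two options for three elements already requires eight groups; add a third subject line and you're at twelve. That multiplication is exactly why multivariate tests demand a bigger list than a simple A/B split.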
In summary, A/B or A/B/C testing is fine if you’re just testing one element or comparing different email versions prior to sending out a campaign. However, if you want to learn about the impact of each of several changes to an email, then multivariate is the way to go!