Make testing work

Email marketers know that it takes data to determine just the right timing, cadence, offer, and design.

Important marketing questions can be answered by gut feel, but gut feel usually gives the fastest answer rather than the best one. Just look at WhichTestWon’s Test of the Week. More often than not, my gut is wrong. What about you?

The challenge always comes where the rubber meets the road: day in and day out, how do I as a marketing leader get the data I need? When I speak with clients, they often tell me how difficult it is to test when there are so many daily fires to put out. The message for us as an industry and as a service provider is that we need to build tools and approaches that make testing easier to fold into the daily routine. But what is the takeaway for a brand marketer?

A recent experience with a non-profit client helps illustrate. The question: should the email donation button be blue or brown?

  • Brown blended nicely with the background, but would it be lost on recipients?
  • Blue popped against the background, but was it too much?
[Image: EmailTestbox code to switch images and track clicks]

Conventionally, the client would have had to hold out a test group, run the A/B test, and then send the balance of the campaign based on the initial results. Using EmailTestbox, however, the client could send the entire campaign at once; two small pieces of HTML code did the work of switching the button images and collecting responses. Once the result reached statistical significance, EmailTestbox showed the winning button to all subsequent openers.
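
The post doesn’t reproduce the actual snippets, but here is a minimal sketch of how this kind of open-time image swap and click tracking typically works. The domain, paths, and CAMPAIGN_ID are hypothetical placeholders, not EmailTestbox’s real API:

    <!-- Hypothetical sketch; EmailTestbox's actual embed code is not shown in the post. -->
    <!-- Piece 1: the button image. The server behind this URL decides, per open,
         which variant (blue or brown) to render, and serves only the winner once
         the result is statistically significant. -->
    <!-- Piece 2: the wrapping link. The redirect records which variant the
         recipient rendered before forwarding them to the donation page. -->
    <a href="https://testbox.example.com/c/CAMPAIGN_ID/donate">
      <img src="https://testbox.example.com/i/CAMPAIGN_ID/donate-button.png"
           alt="Donate" width="220" height="60" style="display:block; border:0;" />
    </a>

Because the click passes through the tracking redirect, every donation click is attributed to whichever button color that recipient actually saw.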

The result? Brown wins! Recipients who rendered the brown button were ~50% more likely to click to donate than a random recipient, and ~200% more likely than a recipient who rendered the blue button.

Because the test ran in real time, this statistically significant result was captured within hours of the send, ensuring that all subsequent openers saw the brown button.
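
The post doesn’t say which significance test EmailTestbox runs under the hood; a standard choice for comparing two click-through rates is the two-proportion z-test:

    z = \frac{\hat{p}_{\text{brown}} - \hat{p}_{\text{blue}}}
             {\sqrt{\hat{p}\,(1 - \hat{p})\left(\frac{1}{n_{\text{brown}}} + \frac{1}{n_{\text{blue}}}\right)}},
    \qquad
    \hat{p} = \frac{x_{\text{brown}} + x_{\text{blue}}}{n_{\text{brown}} + n_{\text{blue}}}

With illustrative counts (the post does not publish the raw numbers), say 3,000 opens per variant with 90 clicks on brown (3%) against 30 on blue (1%), the reported lifts line up: brown runs at 3x blue’s rate (200% more) and 1.5x the blended 2% rate (50% more than a random recipient), and the test yields z ≈ 5.5, far beyond the 1.96 cutoff for 95% confidence. With enough early opens, that threshold can indeed be crossed within hours.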

The lesson for brand marketers?

  1. Minor changes matter. The only change between the two creative versions was the button color, but it produced dramatically different results. Start small, and iterate.
  2. Apply technology where it makes sense. This was a simple A/B test, so a real-time testing application like EmailTestbox may have been more firepower than needed; on the other hand, it saved extra setup work on the campaign side. If your testing goals can be accomplished low-tech, by all means go for it! When you have more complex testing needs (e.g., multivariate), bring technology to bear.
