Email Testing – 3 Greatest Hits from WhichTestWon

This guest post is from Justin Rondeau, editor and evangelist for WhichTestWon.

We asked him to share three of his favorite email tests from his site’s research library. Take it away, Justin …

Optimizing your email campaigns is crucial, and email service providers make it fairly easy to do.

But one of the biggest brick walls I see marketers hit when they are testing is figuring out just what to test.

So I wanted to share the top three tests you could try TODAY on one of your email campaigns.

Following the logical order of how we approach an email campaign, what better place to start than the most common email test: subject lines?

1. Personalization Subject Line Test

Subject lines are one of the easiest tests you can run during your email campaign, but constructing worthwhile subject lines to test is significantly more difficult.

The A/B test I’m sharing today was a personalization subject line test. This is an incredibly simple test to run, as long as you have the personalized information and merge-field capability. On top of being easy to construct, it also cuts down on subject-line writing time, since your copywriter doesn’t have to develop multiple subject lines to test.

Let’s look at the two versions:

Version A
Improve Deliverability in Two Simple Steps

Version B
Amanda, Improve Deliverability in Two Simple Steps

And the winner is …

Version B, the personalized version, increased opens by 5.13% and increased CTR by 17.36%.
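For clarity, those percentages are relative lifts over the control, not absolute percentage-point changes. A quick sketch of the arithmetic (the underlying open rates below are hypothetical, chosen only to illustrate the formula):

```python
def relative_lift(control, variation):
    """Relative lift of the variation over the control, as a percentage."""
    return (variation - control) / control * 100

# Hypothetical open rates: 19.5% for the control, 20.5% for the
# personalized version -- a 1-point absolute gain, but a ~5.13% lift.
print(round(relative_lift(0.195, 0.205), 2))  # 5.13
```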

Why? Personalization can be a gamble: it adds characters to your subject line, and studies have shown that too much personalization can come across as creepy or intrusive. But when personalization is executed well, it can significantly boost response rates.
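If you want to try this yourself, the fallback logic behind a personalized merge field can be sketched in a few lines. This is a hypothetical illustration, not AWeber’s actual implementation, and real merge-tag syntax varies by email service provider:

```python
def personalized_subject(base, first_name=None):
    """Prepend the recipient's first name to the subject when available.

    Falls back to the plain subject so subscribers with a missing or
    blank name field never receive a broken-looking subject line.
    """
    if first_name and first_name.strip():
        return f"{first_name.strip()}, {base}"
    return base

# Version A (control) and Version B (personalized), as in the test above:
print(personalized_subject("Improve Deliverability in Two Simple Steps"))
print(personalized_subject("Improve Deliverability in Two Simple Steps", "Amanda"))
```

Having a fallback matters: a subject line built from an empty merge field (stray commas, raw merge tags) can easily undo any lift from personalization.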

This is a great test, not only in its design but in how it was run. Whenever a content-based newsletter goes out, email stats vary with the actual content of the stories being shared. To control for this variable, AWeber ran the test across three different email campaigns.

2. Body Content Test

The next thing a recipient will see after the subject line is obviously the body content. Here is an example of a really cool body content test we featured on WhichTestWon a few months back.

Did rich HTML (Version A) or text (Version B) increase sales for this company?

Version A

Version B

And the winner is …

Version B, the text-heavy version, beat out the rich HTML version with a 303.8% increase in revenue and a 194.51% increase in traffic to the website.

Though the 10% offer remained the same, the text-based version looked far more personal and didn’t have to rely on the recipient having images turned on. The vast majority of email recipients, especially in the B2B space, have images turned off by default, rendering (no pun intended) your hero shot useless.
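One common hedge against the images-off problem is to send a multipart/alternative message carrying both a plain-text part and an HTML part, so the recipient’s client can fall back gracefully. A minimal sketch using Python’s standard library (addresses and offer copy are placeholders, not from the test above):

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Save 10% on your next order"   # placeholder copy
msg["From"] = "shop@example.com"                 # placeholder addresses
msg["To"] = "subscriber@example.com"

# Plain-text part: what text-only clients (and wary previewers) see.
msg.set_content("Hi,\n\nHere's 10% off your next order with code EXAMPLE10.\n")

# HTML alternative: clients that render HTML will prefer this part, but
# the message still degrades gracefully when they don't.
msg.add_alternative(
    "<p>Hi,</p><p>Here's <strong>10% off</strong> your next order "
    "with code <code>EXAMPLE10</code>.</p>",
    subtype="html",
)

print(msg.get_content_type())  # multipart/alternative
```

Sending both parts doesn’t settle the design question, of course — which part converts better for *your* list is exactly what a test like this one answers.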

Lesson: Understand who is consuming your content and how they consume this content. This will help you develop a high impact email content test.

3. Email Landing Page Test

Once recipients open your email and then click through on the links, it’s crucial to have a landing page that will guide them to the final conversion.

Don’t just toss users onto your home page or a category page. Create (and test) dedicated email campaign landing pages that are consistent with the subject line and body content. Let’s look at a real example from an ecommerce website.

Version A

Version B

And the winner is …

Version A, the variation with the less prominent “Add-to-Cart” box that didn’t spell out the shipping cost for bottles, increased sales by 5% and Revenue per Visitor (RPV) by 41%. Visitors to this landing page came from WineExpress’s “Wine of the Day” email broadcast, sent to their in-house list.

Ecommerce marketers selling heavy items often see a drop-off when shipping prices are revealed. This test hypothesized that showing shipping costs upfront would help avoid sticker shock later in the process.

But being upfront with the customer did not increase sales as originally intended. Glenn Edelman, WineExpress VP of Ecommerce, said the drop in conversions may have been due to moving the Wine Tasting Video below the fold.

What do you think of these tests? Have you run similar ones? What issues and results have you encountered?

Let us know in the comments section!


  1. yawedo

    9/29/2012 4:11 am

    Though the 10% offer remained the same, the text based version looked far more personal and had the luxury of not having to rely on images being turned on by the recipient. The vast majority of email recipients, especially in the B2B space, have images turned off by default rendering (no pun intended) your hero shot useless.

  2. Allan Buckingham

    9/29/2012 8:16 am

    Thanks for sharing some of your test results. These are good things to remember as I start my own e-campaign and advise clients on how best to set up theirs.

  3. Justin Rondeau

    10/1/2012 8:30 am

    @Allan Always happy to share case studies, glad you liked them.

@Yawedo I totally agree with you about the text-based version vs. the rich HTML version. The winning email felt more like a personal message due to the introduction of the recipient’s brand new ‘personal relationship manager’. On top of the personalization, the rich HTML version relied too heavily on images, which can not only impact the design of your email when images are turned off, but also impact your SPAM score if the ratio of text to images is too low.

  4. Rishi

    10/2/2012 2:51 pm

    Really interesting post. I would not have guessed that version B in case study 2 would win.

    Thanks for sharing this info, Hunter!

  5. Hunter Boyle

    10/2/2012 2:59 pm

    Hey Rishi,

Thanks for the feedback. Email testing is chock-full of surprises, and text vs. HTML is one of the biggest challenges to conventional design wisdom. To put it bluntly, sometimes ugly converts better. (“Ugly” being the term most designers would use for plain text email.) That’s why we have to keep testing — nobody wants to miss out on a 303% revenue increase!

Check out WhichTestWon for more great tests like these every week.

    Cheers — Hunter

  6. Lawrence

    10/2/2012 3:05 pm

    I think he means Version B won the final Landing Page Test. He says Version A. But it is Version B that has the prominent Add to Cart button.

    Or am I missing something?

    Very confused…

  7. Hunter Boyle

    10/2/2012 3:26 pm

    Thanks for the feedback, Lawrence. Version A was the winner, but you’re right that the use of “prominent” in the results could be confusing. It’s definitely smaller than Version B, although it is still above the fold, along with the video, and red, so “less prominent” than Version B should clarify that. Hope that helps. Cheers!

  8. Joe

    10/2/2012 3:43 pm

    This really reinforces the necessity of testing, something I know I should do but don’t. If you had asked me which version of each I thought would be more effective, I would have picked the personalized subject line but would have been dead wrong on the other two.

    300% increase in revenue on one of the tests?!!! The price of not testing sure is high.

    Thanks for publishing these interesting results.

  9. Hunter Boyle

    10/2/2012 4:12 pm

Thanks, Joe. Glad you found these results informative. There’s never a better time to start testing than right now — it really is easier than many people think, thanks to the many tools available. And for tests like subject lines, body copy, form fields, and calls to action, your standard email tools offer everything you need.

    Keep us posted if you decide to take these ideas and run some tests yourself. We’d like to feature the results. Maybe you’ll get a 300% gain too? Good luck!

  10. Jay

    10/3/2012 3:25 am

    Testing is SO important! I consider myself pretty savvy but got the last 2 wrong. The use of personalization is effective but you have to be liberal – I use a 1 in 3 or 4 occurrence depending on subject line and topic.
I would not have said the other two would win so strongly – and I wonder what other factors could have had an influence. For example, did the broadcast in example 2 go to a nurtured list, or a cold one? Using images definitely adds a degree of appeal for the target, as they quickly identify any potential core desire, but in this example the risk of HTML being turned off AND no text intro clearly caused a win for the text-heavy version. The answer is in the lesson – understand your audience. I’d like to know how a combination – personalized text and a link to the website with the offer – would compare, which just reinforces: once you have the control, don’t stop – keep testing, testing, testing.
Stunning results in the final test, in terms of revenue. Shipping is another distracting negative ‘consideration’ best left out of the equation until buyers have reached for the wallet and entered the buy phase – unless it’s free (which it should be – another test!). I’m surprised the VP thought the video being below the fold caused the drop-off – surely metrics would be available to cross-check clicks on that medium? Reviews are very influential and important, especially in a subjective buy like wine (unless it’s a repeat offer of a favorite), and they should encourage more of these with incentives. Also, partially hide them so that they can track it.
    Again – test test test!
    Awesome examples guys – keep em coming!

  11. Hunter Boyle

    10/3/2012 9:10 am

    @Jay – Lots of great feedback and testing ideas there, thanks! Couldn’t agree more with your mention of understanding the audience. That’s the whole key to the testing cycle — and why it’s so important to make it habitual, not just an occasional thing.

    The WhichTestWon crew does a great job with these, so check them out for a weekly dose of optimization goodness. Cheers!

  12. Randall Magwood

    10/14/2012 3:53 pm

    I can attest to the personalization subject line test. Whenever I test between using a subscriber’s name in my email subject line and body, good things start to happen. Sales increase, subscribers stay subscribed for longer, and of course… my sales increase. People will say that it doesn’t matter if you use personalization – but I disagree with that. I think everyone should use personalization. It just sticks out in a person’s email inbox.