7 Split Tests You Can Implement Today

How many times have you heard “you should test to see what works best for you” or something to that effect? Probably too many to count, right?

The reason you hear it so often is because when it comes to email marketing (as well as any other marketing channel), testing separates the pros from the Joes.

It’s one thing to think we know what works best, but when we apply a little bit of scientific method to our marketing, we not only find out for sure, we learn more about our visitors and subscribers — and that helps us predict more accurately what will work in the future.

The challenge for a lot of people (including us at AWeber) is deciding what to test. There are simply so many small changes we can make to our forms, messages and other parts of our campaigns, that it’s easy to get stuck on deciding where to start.

So, to help you get started with split testing (or to get back into it if you’ve gotten complacent and stopped testing regularly), here are seven split tests you can run on your website to get and retain more subscribers, lower spam complaints, and increase response.

7 Split Tests

Give these a try and see how they affect your subscribers’ response (not to mention your perception of your subscribers).

See our Knowledge Base for instructions on how to split test your sign up form.

  • Create one signup form where you ask for name first, then email, and one where you ask for email first, then name. See if the order in which you ask for information affects how many people sign up.
  • Send one broadcast with personalization in the subject line, and one without. Do subscribers respond to personalization, or do they see it as a “gimmick?”
  • Split your next message into three broadcasts with different sending times: one between 8AM and 9AM, one between 12PM and 1PM, and one between 4PM and 5PM. Compare open/click rates for each message. Find out what time of day your subscribers prefer to hear from you.
  • Try using a different call to action on your signup form besides the old classic “Submit.” Come up with 2-3 short phrases, create your forms and compare opt-in rates (a couple of options: “Sign Me Up”, “Send Me _____”, “Keep Me Informed”). Keep whatever you’re using now too, and make it the “control” in your experiment. Not everyone wants to submit to getting email from you. Find out what trigger they respond to.
  • Add a privacy statement (i.e. “we will not share your email address…”) to your signup form. Create another form where you instead link to a privacy policy on another page of your site. Compare opt-in rates for those forms against a form where you make no privacy statement. Are visitors more likely to sign up if you tell them you will treat their inbox with respect, and differentiate your email practices from others’?
  • In your next HTML email, test using a button for your call to action against using a text link. Is a well-written text link more compelling than a colorful, more prominent button?
  • For your next broadcast, add a permission reminder (“you’re receiving this email because you signed up at ____” etc.) in the message. Compare your clickthrough rates and your spam complaint rates. Does reminding people why they’re receiving an email make them any more likely to recognize and trust you? Does it make them more likely to read through your email and/or click on links in it?
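Every test on this list uses the same mechanics: assign each visitor or subscriber to one variant at random (but consistently, so the same person always sees the same version), then compare response rates per variant. Here is a minimal sketch of that assignment step in Python (the function name and visitor IDs are hypothetical illustrations, not part of AWeber’s tools):

```python
import random
from collections import Counter

def assign_variant(visitor_id, variants=("A", "B")):
    """Deterministically pick a variant by seeding a RNG with the visitor's ID,
    so repeat visits always land on the same form."""
    rng = random.Random(visitor_id)
    return rng.choice(variants)

# Traffic splits roughly evenly across many visitors
counts = Counter(assign_variant(f"visitor-{i}") for i in range(10_000))
print(counts)
```

Track sign-ups or clicks per variant alongside this assignment and you have the raw numbers every one of the seven tests needs.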

You might have some interesting findings (if so, please share them!)…

…but even if you don’t — even if you run all 7 of these split tests and none of them bring immediate, significant changes to your campaign — you’ll still be more familiar with split testing than you were yesterday, and better prepared to test and improve your campaigns in the future.

Want More Email Marketing Tips and Ideas Like These?

Join the AWeber Blog newsletter and you’ll get them sent straight to your inbox 1-2 times per week.

You’ll also get a free copy of our Email Deliverability Guide detailing 12 Do’s and Don’ts to get more email delivered.

Naturally, as a permission-based email marketing company, we respect your privacy.

What Do YOU Split Test?

Are there other split tests that you run regularly?

Share them with your fellow email marketers below!


  1. Shirley George Frazier

    6/10/2008 9:00 am

    My split tests focus on subject lines to see which one gets opened more frequently.

    Surprisingly, the subject line I thought would be more popular ended up in a tie with the more-bland subject.

    I stopped testing after this happened each time, but what you suggest in this post is quite interesting. It’s difficult to fit more work into a day, but perhaps I’ll try one of these options in the near future.

  2. Bryan Ellis

    6/10/2008 9:45 am

    Very good advice. I recently ran a split test of only one element – subject line. The click-through rate on the winning headline was exactly 65.9% higher than the click-through rate on the losing headline. It’s amazing what one can learn from split testing.

    Thanks for a great service!

  3. Alcuni Split Test Per L’E-mail Marketing | Marketips.net

    6/11/2008 1:13 am

    […] Everything can always be improved, and today we’re talking about some split tests you can run for your email marketing, recommended by AWeber. […]

  4. Aaron Abber

    6/11/2008 7:48 am

    I like the button v. text link idea in HTML. Hadn’t occurred to me. Good point Justin.

    One split test I did was to compare click through rates for my list depending on which pictures I displayed at the main image in my newsletter. My list is about 2/3 male. Over several weeks I split tested identical HTML newsletters–one with a pic of an attractive man as the main image and one with a picture of an attractive woman.

    Over several weeks the CTR for the pic with the woman was 20% higher than the pic of the guy.

    Lesson here: Women can make men do anything.

  5. Jim Cockrum

    6/11/2008 3:14 pm

    Statistically significant results?

    Is a 40% conversion rate better than a 10% conversion rate? There is NO WAY TO KNOW until you know the SAMPLE SIZE, or number of visitors involved.

    To make use of split testing you MUST know if your results are STATISTICALLY SIGNIFICANT enough to make a decision.

    In other words, how do you know when you can stop testing and be CONFIDENT that you’ve got some results worth taking action on?

    Sorry this is a bit nerdy, but it’s VERY necessary to make good decisions when split testing. (I was a statistics minor)

    Let’s keep this SIMPLE though – it doesn’t have to be tough.

    Here’s a scenario as an example:

    If option A gets a 23% conversion after 100 visitors and option B gets a SEEMINGLY better 30% conversion after 100 visitors is there really a significant difference between the two options that warrants dropping option A and committing to Option B?

    Probably not. You need more testing.
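    Jim’s 23% vs. 30% scenario can be checked with a standard two-proportion z-test. A minimal sketch in Python, using only the standard library:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the gap between two conversion rates
    bigger than chance alone would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-tailed
    return z, p_value

# Jim's scenario: option A converts 23/100, option B converts 30/100
z, p = two_proportion_z(23, 100, 30, 100)
print(f"z = {z:.2f}, p = {p:.2f}")  # p is well above 0.05: not significant yet
```

    With p around 0.26, a 23%-vs-30% gap on 100 visitors each could easily be chance, which is exactly Jim’s point: keep testing.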

    You can quickly find out if your testing is producing significant differences using a free SIMPLE tool like this one:


    Make CONFIDENT decisions by understanding and using a simple tool that identifies the STATISTICAL SIGNIFICANCE of your results.

  6. Justin Premick

    6/11/2008 4:02 pm


    Statistical significance does matter, but as you say, it’s best to keep things simple.

    Start worrying too much about p-values, confidence intervals, etc. and you run the risk of just abandoning testing entirely and keeping your controls forever, because that takes less effort than testing.

    Personally, I like the way Paul Myers broke down testing in a newsletter he sent a couple months ago:


    Change or add or remove something.

    Count again.

    More or less?

    More? Keep it.

    Less? Go back to the original and try something else.

    For people who haven’t done a lot of testing yet, that’s a great way to look at it. Once you start getting into more fine-tuned testing, I totally agree – statistical significance is critical.

    Thanks for the reminder about the value of stats — and that tool looks quite useful… thanks for sharing!

  7. Doug Barger

    6/11/2008 7:56 pm

    Awesome post Justin!

    Really great ideas.

    I especially like the "not everyone wants to *submit* to receive your mesages." Lol!

    Makes me imagine testing the use of *Surrender* on the button to see how that would convert.

    But seriously, very worthwhile and profitable information.

    Way to go man and have a great one.

  8. Doug Barger

    6/11/2008 8:00 pm

    Great. It looks like I misspelled the tongue twister, "messages"

    and left out subjects throughout in the name of being more conversational.

    In fact the only sentence containing the proper subject and predicate usage is the one beginning with the word, "I" which supposedly is not good to use either because it’s too much about the one writing and not about the reader.

    Oh well, there is time to improve and the lesson to be learned from my mistake:

    Proofread your comments before clicking *surrender* I mean, submit.

    Respectfully submitted at your service

  9. Scot McKay

    6/12/2008 2:01 am

    For us also, the most compelling split tests have involved the subject line.

    We’re in the dating advice niche, so your mileage may vary, but we split tested the combo of a descriptive subject, a direct question (more inductive) and an intriguing quote lifted directly from the text.

    The time-staggered broadcast that you mentioned brought us very interesting results, including some counter-intuitive trends. We’ve also split-tested which days to send on.

    Others we’ve done:

    1) Text vs. Lite HTML

    2) "In This Issue" teaser vs. none

    3) Rearranging links/subheadings within an e-mail (the ones at the top and the very bottom of the unique content get clicked on the most)

    4) ALL CAP SUBJECTS, vs. Natural sentence structure… vs. Capitalized Standard Title Format

    5) Length of copy

    6) Single offer vs. adding a secondary P.S. and/or front-end teaser offer

    7) Using blatant spam filter shields (i.e "f.ree") vs. changing the syntax altogether

    8) Frequency of e-mails

  10. Rich

    6/12/2008 6:54 am

    You can test the addition of a personal photo on your squeeze page vs just a name and signature.

    Photos generally increase conversion rates but…. not necessarily!

    Always test =)

  11. John Rodriguez

    6/12/2008 2:50 pm

    Hi Justin,

    Some great split-test ideas in this post, though there’s a bit of an issue with the last suggestion:

    "For your next broadcast, add a permission reminder (

  12. Paul Broni

    6/12/2008 4:22 pm

    We also test the sender (both the alias and the actual email address).

    After all, the most challenging part of email marketing is getting the message opened, and there are only two things for a recipient to consider when deciding to open or delete: the sender and the subject line.

  13. Justin Premick

    6/12/2008 5:15 pm


    I’m a fan of the "In This Issue" teaser – just haven’t run enough tests on it yet to decide if I’m going to use it regularly. How’d it work out for you?


    Yeah, SpamAssassin does that sometimes – but I don’t let it bug me. Like you say, unless your score is already high, that little bit isn’t going to be an issue.


    Great call – this is one of the concepts we teach in the How to Get Started webinar. Choose an effective "from" line. Definitely worth testing going forward, too.

  14. Adam Z

    6/15/2008 2:45 am

    Great info for a newbie like me – thanks all.

    May I ask how do you split test a subject line? Do you send half of the list 1 subject, the remainder another? If so, how can that be an accurate test as the campaign is being delivered to 2 isolated groups?

    Still learning…

  15. Justin Premick

    6/16/2008 10:25 am


    There are a couple ways you can do it that may be appropriate for different situations.

    One common way is to create 3 groups – divided into say, 20%/20%/60% of your subscribers. Send out your split test to the 2 smaller groups, then once you’ve seen which version of your email pulls better, send that version to the largest group.

    This is useful when you’re trying to maximize response to one particular email (such as a sales promotion).

    Another way to split test is to do as you say – simply group your subscribers into even groups and send each version.

    This has the advantage of giving you more data points in your test, but does not let you "do something with that data" as quickly as the first split test method does. This second method is more useful for learning about your subscribers’ general tendencies and preferences. (For example, this is what we did in our "button vs. text link" split tests.)
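    The 20%/20%/60% approach described above can be sketched as a simple shuffle-and-slice (a minimal illustration with made-up subscriber IDs, not an AWeber feature):

```python
import random

def split_for_test(subscribers, test_share=0.2, seed=42):
    """Shuffle subscribers, carve off two equal test groups, and keep the
    rest as a holdout that later receives the winning version."""
    rng = random.Random(seed)
    pool = list(subscribers)
    rng.shuffle(pool)
    n_test = int(len(pool) * test_share)
    group_a = pool[:n_test]
    group_b = pool[n_test:2 * n_test]
    holdout = pool[2 * n_test:]
    return group_a, group_b, holdout

a, b, rest = split_for_test(range(1000))
print(len(a), len(b), len(rest))  # 200 200 600
```

    Send version A to the first group, version B to the second, then mail whichever version wins to the 60% holdout.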

    I’m not quite sure what you mean by your question about isolated groups – can you clarify that?

  16. Adam Z

    6/17/2008 7:44 am

    Thanks for the clarification Justin, that makes sense.

    Re my isolated groups: I know that a particular bunch of my subscribers are avid readers/clickers… so if the split test went to that group (of highly active open/clicking people) and the other list is less active, then are the split test results a real indication?

    In other words, some people will almost always open my mail, so if they are subject to list A or list B then my results will be skewed. Does that make any sense 😕

  17. Jim Labadie

    6/17/2008 10:19 am

    Great stuff! I especially love Aaron’s post about the picture of the woman. That will definitely apply to our list. We pretty much always use text emails, but we’ll need to test some HTML.


  18. Aaron Abber

    6/17/2008 5:56 pm



    This week we feature marketing babe Robin Meade of headline news.

    (This pic might be a bit too extreme…I don’t know.)


  19. Miguel GT

    6/18/2008 3:50 am

    I knew I had to do split tests but really didn’t know what to test first. Good article for beginners.

  20. Brian

    6/18/2008 11:36 pm

    This is a great checklist and action plan. My biggest problem is not deciding what to split-test but rather getting organized to actually conduct the test! I’ve summarized the 7 points and posted it on my wall to remind me to actually start split testing! Thanks for the post!

  21. AWeber User Gets 1000% Increase In Opt-Ins Using Popover Form - Email Marketing Tips by AWeber

    7/3/2008 9:11 am

    […] makes me sad, because it shows an unwillingness to split test, which as I discussed recently is key to running effective email marketing […]

  22. Social Proof Tool Boosts Landing Page Conversion 32.4% - Email Marketing Tips by AWeber

    7/25/2008 12:20 pm

    […] start today! To help, here are some split test ideas you can use for your email […]

  23. Alan Boyer

    7/29/2008 9:17 am

    Thanks Justin. Split testing and constant upgrading of emails, articles, white papers all will increase overall results by many times. The first few times I tested things I saw an increase of 5 to 10 times my original response rates.

    One thing that has had a big impact, is testing the subject line of an email, or the title of an article or white paper.

    The subject line determines what percentage of the time that the email gets opened. If it doesn’t get opened nothing else matters, no matter how well you do it, or how valuable what you have to sell.

    You’ve got to get ATTENTION in the subject line to get people to stop and read the rest.

    The subject lines that get the most opens are the ones that get the most attention, and they tend to do the following:
    1) Mention the PAIN that your reader is experiencing.
    2) Add the VALUE they will get from reading further.
    3) Be perceived as something that will HELP them remove a problem and move toward great, valuable results.

    What not to do:
    1) Don’t just give a generic title
    2) Don’t sell something
    3) Don’t just name something that is in the article.

  24. Email Testing Inspiration: Tips and a Recent Example - Inbox Ideas: Email Marketing Tips by AWeber

    4/6/2009 2:56 pm

    […] big gains that tests can bring you. Perhaps you’ve even looked over some of the examples of things you can split test and considered doing […]

  25. Ash

    7/24/2009 12:00 pm

    Whoa! I totally forgot about this. Where have I been? I just recently (2 days ago) set up my own opt-in form (I usually do it for my clients) for my free e-book. I totally forgot about trying different options. I guess I would have gotten around to it some day, but I am glad this e-mail came to me today. I have a few hours to kill. I’m going for it.

    Thanks for the great articles.

  26. PA

    11/23/2010 4:00 pm

    What time line do you suggest for split tests before measuring results?

    I would’ve thought that 1 month would be OK, but then again I’ve seen that certain products I promote can have a 20% difference in monthly sales, so it would be hard to measure minute changes if the same thing applies to email opt-in rates.

  27. Brian M Connole

    12/12/2010 1:02 am

    My split tests never seem to work out that well. I will have to try some of your suggestions…

    Thanks a lot for your article, I think it will help me out in my own testing.

  28. Jack Fisher

    1/8/2011 2:55 pm

    Always great to learn from the gurus !!
    Thank you

  29. Liat

    1/22/2011 2:50 pm

    Thank you for these practical and easy-to-implement suggestions! I’ll get started!

  30. Email Advice…From Lady Gaga?

    2/10/2011 9:28 am

    […] split test and find out what truly works for you. Don’t know what to test? Here are a few split test ideas to get you […]

  31. Amanda

    3/29/2011 7:30 am

    Have you looked at a morning drop vs an evening drop? What did you find?

  32. Bonnie

    7/15/2014 8:29 pm

    Hi there, I am just tweaking my autoresponder series and wanted to conduct a split test. How can I split test an email on my AR series? Can you help me Justin?

    Thanks in Advance,


  33. Brandon Olson

    12/11/2015 10:06 am

    Hi Bonnie. Thanks for your question. Previously, this was not possible. However, with our new email automation platform, Campaigns, you can split test your follow up series. Contact our Customer Solutions team for assistance.