Do Buttons Get Clicked More Than Text Links?

Many of our readers have already signed up to the live seminar on split testing that we announced last week.

But even if you can’t make it, you’re probably interested in learning more about split testing now, right?

Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.

Today, let’s look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.

The Test

Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.

One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”

Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.

So, Marc created a button-shaped image with the words “Read More” stamped on it.

We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.

The emails were otherwise identical: we kept subject lines, sending dates/times and templates the same for each version.
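
If it helps to picture the setup, here’s a minimal sketch of the two versions in Python. The field names and values are illustrative placeholders, not our actual broadcast settings:

```python
# Minimal sketch of the split test setup: everything held constant
# except the call to action. Names and values are illustrative only.

shared = {
    "subject": "New on the blog this week",   # same subject line for both versions
    "template": "blog-broadcast",             # same email template
    "send_time": "Tuesday 9:00 AM",           # same sending date/time
}

version_a = {**shared, "call_to_action": "button image ('Read More')"}
version_b = {**shared, "call_to_action": "text link"}
```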

Measuring Our Success

Clicks-to-Opens: clicks divided by opens.

Since we’re trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.

We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.

That way, if one version of the message got an unusually high number of opens, it wouldn’t skew the results to make that version’s call to action look more effective than it really was.
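
If you want to run the numbers on your own campaigns, here’s a rough Python sketch of the comparison. The counts below are made up for illustration; they are not our actual data:

```python
# Sketch: comparing two email variants by clicks-to-opens.
# All counts are hypothetical, purely for illustration.

def clicks_to_opens(clicks, opens):
    """Clicks divided by opens, so a variant with an unusually high
    open count doesn't look more effective than it really is."""
    return clicks / opens if opens else 0.0

button = {"opens": 1200, "clicks": 90}      # hypothetical button variant
text_link = {"opens": 1450, "clicks": 87}   # hypothetical text link variant

button_cto = clicks_to_opens(button["clicks"], button["opens"])
text_cto = clicks_to_opens(text_link["clicks"], text_link["opens"])

lift = (button_cto - text_cto) / text_cto * 100
print(f"Button: {button_cto:.1%}  Text: {text_cto:.1%}  Lift: {lift:.1f}%")
```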

Our Expectations

Going into this test, we expected the button to beat the text links handily.

Why?

It was physically larger than the text link.
It contained a clear call to action, “Read More”, while a contextual link might be less obvious.
It was an image placed in a part of the email where readers hadn’t previously been shown images.

Basically, we expected the button would grab people’s attention as they scanned through the email.

On the flipside, we knew that readers might have images disabled and wouldn’t see the button.

So we added the ALT text “Read More” to the button image.

Since the text “Read More” would appear in place of the button when images were disabled, we felt that even for those readers, the button should do at least approximately as well as the text link.
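
For the curious, here’s roughly what the two calls to action might look like in the email HTML, built up in Python so you can see the ALT text fallback. The URL and markup are placeholders, not our actual template:

```python
# Sketch of the two call-to-action variants. URLs and markup are placeholders.

post_url = "https://www.example.com/blog/latest-post"  # hypothetical blog post URL

# Version A: button image. The alt="Read More" text is what readers
# with images disabled see (and can click) in place of the button.
button_cta = (
    f'<a href="{post_url}">'
    '<img src="https://www.example.com/images/read-more-button.png" '
    'alt="Read More" width="120" height="40" border="0">'
    '</a>'
)

# Version B: the plain text link we had been using.
text_cta = f'<a href="{post_url}">Read the full post on our blog</a>'
```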

Initial Results

In our first split test, the button call to action outperformed our text link by 51.4%.

We started running the test on our Blog Broadcasts last year.

As we expected, the button grabbed readers’ attention and prompted them to click through much better than the text link did.

Clicks-to-opens for the button was repeatedly higher, a lot higher, than it was for the text link.

In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.

At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”

…But, We Kept Going!

We could have stopped after those first few tests, and in many cases, one-time or short-term split tests are appropriate.

However, even in our initial few tests, the text had beaten the button once, and by a large margin.

I wanted to see whether the button was doing better because it was a more compelling call to action in general, or because of the “novelty” of it.

So we continued to split test our Blog Broadcasts…

Later Results

Further testing showed that using buttons instead of text was not a good long-run tactic.

We ultimately ran the button-versus-text split test about 40 times, over the course of several months.

For a while, the button continued to beat the text links, but we noticed that it wasn’t doing so by as large a margin as it first had.

While the button beat the text by over 33% across our first five tests, after 20 tests it was winning by an average of only 17.29%, and the text version was beginning to hold its own in the win column.

Final Outcome

With each new split test, the text asserted itself as the better call to action.

By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.

Conclusions: What Happened?

The button is effective in the short run, but after a while readers become “numb” to it and no longer respond at the same initial high rate.

Consider the following two stats from our split tests:

Overall, the button won 53% of our split tests.
In the first three weeks alone, it won two-thirds of them.

That first stat doesn’t seem important in and of itself: 53% is barely over half the time.

However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true ? the button just stopped “working.”
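
If you log each broadcast’s winner, spotting this kind of wear-off takes only a few lines. Here’s a rough sketch in Python; the win/loss record below is made up for illustration, not our actual test log:

```python
# Sketch: checking whether the button's advantage faded over time.
# Each entry is one split-test broadcast: True if the button won on
# clicks-to-opens, False if the text link won. Hypothetical data.
button_wins = [
    True, True, False, True, True, True, True, True, False, True,  # earlier broadcasts
    False, True, False, False, True, False, False, True, False,    # later broadcasts
    False, False, True, False, False, False,
]

half = len(button_wins) // 2
early, late = button_wins[:half], button_wins[half:]

def win_rate(results):
    return sum(results) / len(results)

print(f"Button win rate, first half of tests:  {win_rate(early):.0%}")
print(f"Button win rate, second half of tests: {win_rate(late):.0%}")
```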

Which brings us to conclusion #2:

Test results are not forever.

What works today may not work tomorrow.

Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.

It’s important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.

Finally, one last point I feel obligated to make:

What works for someone else may not work for you.

The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.

Buttons may work well for you in the short run. Split test them.

We send our blog newsletter often, 2 or 3 times a week. So we exposed subscribers to the button often, which may have increased the speed with which they started to ignore it.

If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.

Plus, we tested one specific button. Another button, with a different design or wording, might have been more effective.

Again, don’t just take our word for it. Find out for yourself through your own testing.

What Do You Think?

What are your thoughts on this case study?

Have you tested calls to action or other elements of your email newsletters? What were your findings?

Can you think of any other factors that may have influenced our results?

Share your reactions below!

By Justin Premick, former Director of Educational Products at AWeber.


46 Comments

  1. Very interesting…Thank you for posting the results of the split testing. I appreciate that. The color used for the button may have some effect on its results. The words used to call for the action also may have had some effect on the results. I think you are right that a person exposed to the same "button" would become numb to it after a while. I suspect that if you varied the words and the colors on the button the effect it has would be sustained longer.

    Good stuff people, keep it coming.

    3/25/2008 8:19 pm
  2. Chris

    Great post. Would like to see more content on this topic (split testing).

    3/25/2008 8:19 pm
  3. Great post Justin,

    One question: were your text links tracked using your AWeber link tracking, or using Google Analytics, i.e., with analytics tags on the end of the blog URL?

    3/26/2008 5:17 am
  4. Dustan,

    I agree – there are definitely some other things we could try to sustain the positive effects of using a button over text links. But I think the overall results, over a long enough time period, would be the same for us – eventually people would become numb to the button and/or resent it.

    Graham,

    We used our own click tracking for gathering and comparing clickthroughs. While we do also track activity on email clickthroughs with Google Analytics (to see what content people clicking through from emails are interested in, for example, or to see what proportion of traffic to a page was due to email), it was more convenient to compile our data in AWeber, and we didn’t see any compelling reason to use GA to record our test results.

    3/26/2008 8:35 am | Follow me on Twitter
  5. As I was reading this I was alarmed that the button was outperforming the text link. My own testing has always had a text link getting the best clickthrough. But I settled down after the long-term results came out in favor of the text link.

    The surprise here is the patience it must have taken to allow the text link to continue to run instead of attempting to beat the (apparently) new control.

    3/26/2008 3:50 pm
  6. Mitch,

    It was definitely tempting to switch – though I think that since our goal was getting more people to read the blog, rather than, say, driving them to an order link, it was easier to restrain myself and let the experiment play out :)

    3/27/2008 8:09 am | Follow me on Twitter
  7. Fascinating stuff! I particularly like seeing the value of retesting through time. Thanks for this.

    Wonder what results you might have got with an HTML button (not sure what the right terminology is). Instead of an image, using table cells, background colors and fonts to create the illusion of a button (but one which would not be affected by image blocking).

    Perhaps worth another test of its own? :-)

    3/27/2008 8:27 am
  8. I’m really new at this and the info you are putting out is very helpful. I have one question about this test. I have decided in my mailings to use text only because I don’t want the HTML picked up by spam filters. So, would a short snippet of code for this button have a large impact on the spam filters?

    3/27/2008 9:04 am
  9. Your testing just taught me a new sales tool, though. I use text in my broadcasts. I offer ebooks to my customers. I can see where using a button for a new product is worth the effort, then as the product ages, convert it to text. The customer will become used to seeing the button for new products. I love this!

    3/27/2008 12:00 pm
  10. Very interesting test. I would have thought the button would have won out every time. I believe it will still be an effective tool if used in moderation, only when you want to draw attention to something very special. It will be something I try in the future.

    Thanks

    3/27/2008 12:04 pm
  11. Hey
    Thanks, this is some great info. I thought the button would have won hands down also. But as we can see from your test, we need to test before we draw our conclusions.
    Thanks for this

    3/27/2008 12:12 pm
  12. Thanks for posting this interesting study Justin.

    I especially appreciate your scientific methodology and persistence in collecting data over time. Too often people take the facts and figures that support their position and run with it. It would seem to me that in this case it might be good to change our approach from time to time so people don’t get numb to what we are doing.

    Keep up the good work.

    3/27/2008 12:26 pm
  13. This is an interesting illustration of human psychology: we tend to ignore things we are used to.

    I think the challenge (and the reason we are never finished testing) is to determine what should be changed and how.

    3/27/2008 1:14 pm
  14. Adam Jones

    G’day Justin,

    Thanks for the case study, it really does show how to split test.

    My question relates to the actual definition of "test".

    Your study mentions that you ran the test over 20 times. How do you define a "test"? Is it a period of time, a number of opens, or a single broadcast?

    It is an interesting dimension to split testing that I had not thought about before.

    3/27/2008 1:40 pm
  15. Mark,

    Good thought – image blocking could definitely impact our results, especially at the beginning before people caught on that we were using images as the call to action.

    We’re currently doing some other testing, but I wouldn’t count out the possibility of bringing back the button-text test in another format (or on another list)… :)

    Connie,

    I’m not sure what you mean. Are you sending plain text-only emails? Or do you mean that you’re sending HTML emails that don’t have much styling (i.e. they look like plain text emails)?

    If you’re actually sending plain text-only emails, you won’t be able to add a button to those (that can only be done with HTML, so you’d have to start sending in HTML).

    As for email deliverability, I wouldn’t be concerned. Sending HTML in and of itself does not get you filtered – it’s about how your readers respond to your messages (do they mark them as spam?) that primarily determines if you get filtered. As long as your HTML emails are requested, relevant and valuable you’ll do just fine.

    Larry,

    I tend to agree with you – we saw big gains in our click throughs at first using the button, and a lot of email marketers use them in promotional messages. The lesson I learned from this was not to use them so often that they stopped "working."

    Adam,

    When I refer to a "test" I’m talking about one broadcast split test. So the ~40 tests we ran are actually ~40 individual split test broadcasts that we sent to our blog’s email subscribers.

    3/27/2008 2:00 pm | Follow me on Twitter
  16. Good Job;

    I do question the crowd the email was sent to, though. If it went to newbies and mostly buyers, then the button might be more of a "curiosity" item. On the other side, if the emails went by and large to experienced marketers, the button would seem more unfamiliar, more for the untrained than for the experienced.

    To the marketers, the link is more amiable and familiar, which brings up the question: do we need to have a different psychology for new opt-ins than for the experienced ones that have been on a list for a long time?

    More "Split Testing" – eh???

    Keep up the good work though. We need to think and work on items outside of our own box, so to speak.

    3/27/2008 4:14 pm
  17. Hi,

    Thanks for a great post. It got me to thinking that I’ll test my links for longer and watch more closely the impact.

    It’s interesting that the greatest clickthroughs I get are on posts with pictures of my family events (nothing to do with the newsletter per se, just me humanizing myself for my readers) and on the free self-assessments I do for people each month. I’m now thinking that maybe buttons on those links that I have seriously monetized may entice more clicks. Definitely something for me to test!

    Thanks again for pushing me to think a bit more laterally about my testing.

    3/27/2008 4:54 pm
  18. Baba Gaye

    Hi
    I wanted to register for your free one-hour seminar on Split Testing Emails and Web Forms, but I missed it.
    Please let me know about such events next time; I'm very interested.

    3/28/2008 11:45 am
  19. Nice case study…

    I’ve performed very similar studies and found that there are a lot of variables at play depending on what market you’re working in.

    Also, there were different results in different markets depending on the use of serif fonts or sans-serif fonts.

    As you said though, the best way to figure out what works is to test.

    3/28/2008 2:32 pm
  20. Hi Justin,

    Thank you and Marc for sharing your results with us.

    Those that didn’t read to the conclusion of the article may have thought that the buttons won hands down.

    I think it shows that you care about the success of your subscribers and the test results being shared is a nice gesture and generous content.

    Thank you!

    3/28/2008 4:58 pm
  21. I am curious just what studies are done and what they are looking for. I am excited to know what makes them happy with the input that is given by each person. Thanks for your time. Kathleen

    3/30/2008 1:19 am
  22. Justin – this is a superb test. Many thanks for the results.

    The findings are very interesting and it is unlikely anyone could have predicted them!

    Brilliant – thank you!

    4/15/2008 12:18 pm
  23. shrestha12

    Buttons look fancier and more attractive than plain old links. They're more fun and stylish, so they're more clickable.

    10/19/2008 3:30 am
  24. Very interesting.

    I also find text links to be the better performers.

    Especially if their placement and their match with the content are logical.

    That way, you get better results for affiliate marketing.

    6/9/2009 10:14 am
  25. O

    Thanks for posting the split testing method and outcome. We have created several sites for various promotional product companies and they all seem very adamant about having large buttons that blink or have movement to catch the visitor's eye. Personally, I like small, non-obtrusive buttons or text links. Having overly large buttons or text is like typing in all caps or reading a children's book.
    However, they insist on it and can vouch for it by their pay-per-click campaigns. We can only offer advice and, in the end, give them what they ask for to the best of our ability.

    8/10/2009 3:36 pm
  26. Those are interesting results. When the multiple tests were run, did the button size/shape/color change, or was it constant? Did the text link location and/or wording change during those successive runs?

    1/28/2010 6:17 pm
  27. Very interesting test,
    now maybe a lesson on how to create buttons?

    2/15/2010 3:08 am
  28. My experience is that when using a new button, it works very well, but after a while the CTR drops. With text links, it stays more at the same level. That's why I like to use more text links.

    2/18/2010 3:06 pm
  29. In my experience, text links work better on affiliate websites. CTR is about the same as with buttons, but the quality is way higher.

    4/11/2010 8:22 am
  30. This is a very informative post and thanks for highlighting what we can do with split testing. Isn’t it interesting that we cannot trust what we see? Lesson learned: Need to continue to test and monitor.

    That does seem to imply that we need to introduce new things to keep people from getting numb.

    6/29/2010 11:22 am
  31. Very useful info. Thanks as always!

    9/12/2010 6:29 pm
  32. Testing is obviously very important. It is amazing the results you have found in this test. I would have picked the other button.

    10/27/2010 7:20 am
  33. Testing is awesome. Split testing can really help your site get better conversions.

    10/27/2010 9:58 pm
  34. I woke up this morning determined to listen to what my market has been asking for – and I haven’t been paying attention. Your article will help me take the next step – to adapt my branding and jump start a fresh approach to supporting and attracting readers. I’ll report back on what happened after a few weeks of concentrated effort! Thank you again for all the great info on your site. I’ve used a number of email marketing services in the past, but have fallen in love with you folks!

    11/25/2010 8:48 am
  35. I found it interesting that you said people became desensitized to the image over time. It is amazing to me what a big difference the smallest changes can make. I would like to learn more about how I could do split testing for various features of my website. Thanks for sharing this information.

    3/1/2011 7:14 pm
  36. Elizabeth,

    I think there’s a lot to be said for periodic testing and updating of elements of your email campaigns, as well as other marketing assets.

    I would not at all be surprised to find a similar desensitization among people viewing ads on your website, viewing ads you are running to get people to your website, or even reading your subject lines (if you are regularly including the same things in your subjects – name personalization, your company name, ALL CAPS, and so on). There’s something to be said for giving your content a refresh now and then.

    3/2/2011 11:06 am | Follow me on Twitter
  37. Everyone. Buttons, split testing, etc. Always remember that readers want good content, so don't get carried away with all this testing if it causes your content quality and quantity to drop. Google searches are where we get our visitors from.

    4/11/2012 11:30 am
  38. Kasim

    The test tells me that it's a novelty effect. If the button outperforms for 2 weeks, then I would use it for 2 weeks a year.

    These results remind me of what happens to the take-up rate of a product during and after a launch – you get a spike during a launch which then quickly comes down to normal levels.

    So, you have to keep launching on a regular basis giving people time to forget the previous launch.

    9/5/2012 9:12 am