Do Buttons Get Clicked More Than Text Links?
By Justin Premick | March 25, 2008
Many of our readers have already signed up for the live seminar on split testing that we announced last week.
But even if you can’t make it, you’re probably interested in learning more about split testing now, right?
Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.
Today, let’s look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.
The Test
Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.
One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”
Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.
So, Marc created a button-shaped image with the words “Read More” stamped on it:
We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.
The emails were otherwise identical: we kept subject lines, sending dates/times and templates the same for each version.
Measuring Our Success
Clicks-to-Opens: clicks divided by opens.
Since we’re trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.
We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.
That way, if one version of the message got an unusually high number of opens, it wouldn’t skew the results to make that version’s call to action look more effective than it really was.
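If you'd like to see the arithmetic spelled out, here's a minimal sketch of the clicks-to-opens calculation and the relative difference between two versions. The numbers are made up purely for illustration; this isn't our actual reporting code.

```python
# A minimal sketch of the clicks-to-opens metric and the relative lift
# between two versions of a broadcast. The figures are made up; they are
# not our actual reporting code or data.

def clicks_to_opens(clicks: int, opens: int) -> float:
    """Clicks divided by opens, expressed as a percentage."""
    return 100.0 * clicks / opens if opens else 0.0

# Hypothetical numbers for a single split test broadcast.
button_cto = clicks_to_opens(clicks=120, opens=800)  # version A: image button
text_cto = clicks_to_opens(clicks=95, opens=790)     # version B: text link

# Relative lift of the button version over the text link version.
lift = 100.0 * (button_cto - text_cto) / text_cto
print(f"Button: {button_cto:.2f}%  Text: {text_cto:.2f}%  Lift: {lift:+.2f}%")
```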
Our Expectations
Going into this test, we expected the button to beat the text links handily.
Why?
Basically, we expected the button would grab people’s attention as they scanned through the email.
On the flipside, we knew that readers might have images disabled and wouldn’t see the button.
So we added the ALT text “Read More” to the button image.
Since with images disabled the text “Read More” would appear in place of the button, we felt that even for those readers with images disabled, the button should do at least approximately as well as the text link.
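To make the setup concrete, here's a rough sketch of what the two calls to action might have looked like in the email HTML. The URL, image path and markup below are placeholders for illustration, not our actual templates.

```python
# A rough illustration of the two call-to-action variants in the A/B split.
# The URL, image path and markup are placeholders, not the actual templates.

POST_URL = "https://example.com/blog/latest-post"

# Version A: button image with ALT text, so readers with images disabled
# still see the words "Read More" in place of the button.
button_cta = (
    f'<a href="{POST_URL}">'
    f'<img src="https://example.com/images/read-more-button.png" '
    f'alt="Read More" width="120" height="40" border="0"></a>'
)

# Version B: the plain text link we had been using all along.
text_cta = f'<a href="{POST_URL}">Read the full post on our blog</a>'
```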
Initial Results
We started running the test on our Blog Broadcasts last year.
As we expected, the button grabbed readers' attention and prompted them to click through at a much better rate than the text link did.
Clicks-to-opens for the button was repeatedly higher (a lot higher) than it was for the text link.
In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.
At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”
…But, We Kept Going!
We could have stopped after those first few tests; in many cases, one-time or short-term split tests are appropriate.
However, even in our initial few tests, the text had beaten the button once, and by a large margin.
I wanted to see whether the button was doing better because it was a more compelling call to action in general, or because of the “novelty” of it.
So we continued to split test our Blog Broadcasts…
Later Results
We ultimately ran the button-versus-text split test about 40 times, over the course of several months.
For a while, the button continued to beat the text links, but we noticed that it wasn't doing so by as large a margin as it first had.
While over our first five tests the button beat the text by over 33%, after 20 tests it was winning by an average of only 17.29%, and the text version was beginning to hold its own in the win column.
Final Outcome
With each new split test, the text asserted itself as the better call to action.
By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.
Conclusions: What Happened?
Consider the following two stats from our split tests:
- Overall, text links outperformed buttons 53% of the time.
- After an initial period where the button was more effective, text links outperformed buttons 67% of the time.
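For anyone curious how a "win rate" like those above gets tallied, here's a quick sketch over a made-up series of test results. The figures are invented for illustration; they are not our actual data.

```python
# A rough sketch of tallying a win rate over a series of split tests.
# The results below are made up for illustration, not our actual data.

results = [
    # (button clicks-to-opens %, text clicks-to-opens %) for each broadcast
    (15.0, 11.2), (14.1, 10.8), (13.5, 12.9), (11.0, 12.4), (12.2, 13.6),
    (10.4, 12.1), (11.8, 11.0), (9.9, 12.7),
]

text_wins = sum(1 for button, text in results if text > button)
win_rate = 100.0 * text_wins / len(results)
print(f"Text link won {text_wins} of {len(results)} tests ({win_rate:.0f}%)")
```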
That first stat doesn't seem important in and of itself; 53% is barely over half the time.
However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true: the button just stopped “working.”
Which brings us to conclusion #2:
What works today may not work tomorrow.
Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.
It’s important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.
Finally, one last point I feel obligated to make:
The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.
We send our blog newsletter often: 2 or 3 times a week. So we exposed subscribers to the button often, which may have increased the speed with which they started to ignore it.
If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.
Plus, we tested one specific button. Perhaps another one, with a different design or wording, would have been more effective.
Again, don’t just take our word for it. Find out for yourself through your own testing.
What Do You Think?
What are your thoughts on this case study?
Have you tested calls to action or other elements of your email newsletters? What were your findings?
Can you think of any other factors that may have influenced our results?
Share your reactions below!
Dustan Harless
3/25/2008 8:19 pm
Very interesting… Thank you for posting the results of the split testing. I appreciate that. The color used for the button may have some effect on its results. The words used to call for the action also may have had some effect on the results. I think you are right that a person exposed to the same "button" would become numb to it after a while. I suspect that if you varied the words and the colors on the button, its effect would be sustained longer.
Good stuff people, keep it coming.
Chris
3/25/2008 8:19 pm
Great post. Would like to see more content on this topic (split testing).
Graham Cox
3/26/2008 5:17 am
Great post Justin,
One question, were your text links tracked using your aweber link tracking or using Google Analytics, ie, with analytics tags on the end of the blog url?
Justin Premick
3/26/2008 8:35 am
Dustan,
I agree – there are definitely some other things we could try to sustain the positive effects of using a button over text links. But I think the overall results, over a long enough time period, would be the same for us – eventually people would become numb to the button and/or resent it.
Graham,
We used our own click tracking for gathering and comparing clickthroughs. While we do also track activity on email clickthroughs with Google Analytics (to see what content people clicking through from emails are interested in, for example, or to see what proportion of traffic to a page was due to email), it was more convenient to compile our data in AWeber, and we didn’t see any compelling reason to use GA to record our test results.
Mitch Tarr
3/26/2008 3:50 pm
As I was reading this, I was alarmed that the button was outperforming in the tests. My own testing has always had a text link getting the best clickthrough. But I settled down after the long-term results came out in favor of the text link.
The surprise here is the patience it must have taken to allow the text link to continue to run instead of attempting to beat the (apparently) new control.
Justin Premick
3/27/2008 8:09 am
Mitch,
It was definitely tempting to switch – though I think that since our goal was getting more people to read the blog, rather than say driving them to an order link, it was easier to restrain myself and let the experiment play out 🙂
Mark Brownlow
3/27/2008 8:27 am
Fascinating stuff! Particularly like seeing the value of retesting through time. Thanks for this.
Wonder what results you might have got with an HTML button (not sure what the right terminology is). Instead of an image, using table cells, background colors and fonts to create the illusion of a button (but one which would not be affected by image blocking).
Perhaps worth another test of its own? 🙂
Connie Sanders
3/27/2008 9:04 am
I’m really new at this and the info you are putting out is very helpful. I have one question about this test. I have decided in my mailings to use text only because I don’t want the HTML picked up by spam filters. So, would a short snippet of code for this button have a large impact on the spam filters?
Hope Clark
3/27/2008 12:00 pm
Your testing just taught me a new sales tool, though. I use text in my broadcasts. I offer ebooks to my customers. I can see where using a button for a new product is worth the effort, then as the product ages, convert it to text. The customer will become used to seeing the button for new products. I love this!
Larry Buhrandt
3/27/2008 12:04 pm
Very interesting test. I would have thought the button would have won out every time. I believe it will still be an effective tool if used in moderation, only when you want to draw attention to something very special. It will be something I try in the future.
Thanks
Coach
3/27/2008 12:12 pm
Hey,
Thanks, this is some great info. I thought the button would have won hands down also. But as we can see from your test, we need to test before we draw our conclusion.
Thanks for this
Ken Wolff
3/27/2008 12:26 pm
Thanks for posting this interesting study Justin.
I especially appreciate your scientific methodology and persistence in collecting data over time. Too often people take the facts and figures that support their position and run with it. It would seem to me that in this case it might be good to change our approach from time to time so people don’t get numb to what we are doing.
Keep up the good work.
Andre Vatke
3/27/2008 1:14 pm
This is an interesting illustration of human psychology: we tend to ignore things we are used to.
I think the challenge (and the reason we are never finished testing) is to determine what should be changed and how.
Adam Jones
3/27/2008 1:40 pm
G’day Justin,
Thanks for the case study, it really does show how to split test.
My question relates to the actual definition of "test".
Your study mentions that you ran the test over 20 times. How do you define a "test"? Is it a period of time, a number of opens, or a single broadcast?
It is an interesting dimension to split testing that I had not thought about before.
Justin Premick
3/27/2008 2:00 pm
Mark,
Good thought – image blocking could definitely impact our results, especially at the beginning before people caught on that we were using images as the call to action.
We’re currently doing some other testing, but I wouldn’t count out the possibility of bringing back the button-text test in another format (or on another list)… 🙂
Connie,
I’m not sure what you mean. Are you sending plain text-only emails? Or do you mean that you’re sending HTML emails that don’t have much styling (i.e. they look like plain text emails)?
If you’re actually sending plain text-only emails, you won’t be able to add a button to those (that can only be done with HTML, so you’d have to start sending in HTML).
As for email deliverability, I wouldn’t be concerned. Sending HTML in and of itself does not get you filtered – it’s about how your readers respond to your messages (do they mark them as spam?) that primarily determines if you get filtered. As long as your HTML emails are requested, relevant and valuable you’ll do just fine.
Larry,
I tend to agree with you – we saw big gains in our click throughs at first using the button, and a lot of email marketers use them in promotional messages. The lesson I learned from this was not to use them so often that they stopped "working."
Adam,
When I refer to a "test" I’m talking about one broadcast split test. So the ~40 tests we ran are actually ~40 individual split test broadcasts that we sent to our blog’s email subscribers.
Mike
3/27/2008 4:14 pm
Good job!
I do question the crowd the email was sent to, though. If it went to newbies and mostly buyers, then the button might be more of a "curiosity" item. On the other side, if the recipients were by and large experienced marketers, the button would seem more unfamiliar, more for the untrained than the experienced.
To the marketers, the link is more amiable and familiar, which brings up the question: do we need different psychology for new "opt-ins" than for the experienced ones who have been on a list for a long time?
More "Split Testing" – eh???
Keep up the good work though. We need to think and work about items outside of our own box, so to speak.
Shelley Holmes
3/27/2008 4:54 pm
Hi,
Thanks for a great post. It got me thinking that I’ll test my links for longer and watch the impact more closely.
It’s interesting that the greatest clickthroughs I get are on links to pictures of my family events (nothing to do with the newsletter per se, just me humanizing myself for my readers) and to the free self-assessments I do for people each month. I’m now thinking that buttons on those links that I have seriously monetized may entice more clicks. Definitely something for me to test!
Thanks again for pushing me to think a bit more laterally about my testing.
Baba Gaye
3/28/2008 11:45 am
Hi,
I wanted to register for your free one-hour seminar on Split Testing Emails and Web Forms, but I missed it.
Please let me know about such events next time; I’m very interested.
Dave Ryan
3/28/2008 2:32 pm
Nice case study…
I’ve performed very similar studies and found that there are a lot of variables at play depending on what market you’re in.
Also, there were different results in different markets depending on the use of serif fonts or sans-serif fonts.
As you said though, the best way to figure out what works is to test.
Doug Barger
3/28/2008 4:58 pm
Hi Justin,
Thank you and Marc for sharing your results with us.
Those who didn’t read to the conclusion of the article may have thought that the buttons won hands down.
I think it shows that you care about the success of your subscribers; sharing the test results is a nice gesture and generous content.
Thank you!
Using Buttons and Text Links to Improve Conversion
3/29/2008 10:36 am
[…] Jason Premick at Aweber has posted the results of a split test he conducted on the use of buttons vs text links: Do Buttons Get Clicked More Than Text Links? […]
AutoConversion Blog » Buttons or Links: Which One Brings More Traffic?
3/29/2008 11:11 pm
[…] the company posted a blog about split testing. A couple guys in the company had a debate about which one causes visitors to […]
kathleen
3/30/2008 1:19 am
I am curious just what studies are done and what they are looking for. I am excited to know what makes them happy with the input that is given by each person. Thanks for your time. Kathleen
Online Testing - Buttons vs. Text Links | eCommerce Small Business Startup Consulting & Self Motivation
3/31/2008 9:50 pm
[…] recently found a cool post while twittering. It is titled “Do Buttons Get Clicked More Than Text Links? A Case Study” by Justin Premick. It talks all about testing in regards to an email campaign. Justin and […]
peter hobday
4/15/2008 12:18 pm
Justin – this is a superb test. Many thanks for the results.
The findings are very interesting and it is unlikely anyone could have predicted them!
Brilliant – thank you!
shrestha12
10/19/2008 3:30 am
Buttons look fancier and more attractive than old plain Jane links. It’s more fun and stylish so it’s more clickable.
Email Testing Inspiration: Tips and a Recent Example - Inbox Ideas: Email Marketing Tips by AWeber
4/6/2009 2:50 pm
[…] I think there is value in testing minor changes (we’ve even tested button design ourselves), making larger changes is a better place to start for most […]
Email Marketing: Testing Recommendations « SCORE Ask an Expert Blog
5/8/2009 9:09 am
[…] Create multiple signup forms for the same landing page and test the subscription forms. Do
Sjoerd
6/9/2009 10:14 am
Very interesting.
I also find text links the better performers, especially when the placement and the match with the content are logical.
That way you get better results for affiliate marketing.
Want More Sales? Enhance Your Conversion Rate With Easy-to-Follow Navigation and Strong Calls-to-Action | The Adventures of SEO Boy
7/13/2009 8:34 am
[…] are several case studies out there on how calls to action can improve conversion rates, but I have a case study of my own to […]
O
8/10/2009 3:36 pm
Thanks for posting the split testing method and outcome. We have created several sites for various promotional product companies, and they all seem very adamant about having large buttons that blink or have movement to catch the visitor’s eye. Personally, I like small, non-obtrusive buttons or text links. Having overly large buttons or text is like typing in all caps or reading a children’s book.
However, they insist on it and can vouch for it by their pay-per-click campaigns. We can only offer advice and, in the end, give them what they ask for to the best of our ability.
Buttons or Links: Which One Brings More Traffic? | AutoConversion Blog
1/3/2010 1:18 am
[…] the company posted a blog about split testing. A couple of guys in the company debated
Sal Conigliaro
1/28/2010 6:17 pm
Those are interesting results. When the multiple tests were run, did the button size/shape/color change, or was it constant? Did the text link location and/or wording change during those successive runs?
steve
2/15/2010 3:08 am
Very interesting test,
now maybe a lesson on how to create buttons?
Tuinhuis
2/18/2010 3:06 pm
My experience is that when using a new button, it works very well, but after a while the CTR drops. With text links, it stays more at the same level. That’s why I like to use more text links.
Esther
4/11/2010 8:22 am
In my experience, text links work better on affiliate websites. CTR is about the same as buttons, but the quality is way higher.
Louisa Chan
6/29/2010 11:22 am
This is a very informative post; thanks for highlighting what we can do with split testing. Isn’t it interesting that we cannot trust what we see? Lesson learned: need to continue to test and monitor.
That does seem to imply that we need to introduce new things to keep people from getting numb.
kari
9/12/2010 6:29 pm
Very useful info. Thanks as always!
Freddie
10/27/2010 7:20 am
Testing is obviously very important. The results you have found in this test are amazing. I would have picked the other button.
John
10/27/2010 9:58 pm
Testing is awesome. Split testing can really help your site get better conversions.
Laura Benjamin
11/25/2010 8:48 am
I woke up this morning determined to listen to what my market has been asking for – and I haven’t been paying attention. Your article will help me take the next step – to adapt my branding and jump start a fresh approach to supporting and attracting readers. I’ll report back on what happened after a few weeks of concentrated effort! Thank you again for all the great info on your site. I’ve used a number of email marketing services in the past, but have fallen in love with you folks!
Elizabeth Ellis
3/1/2011 7:14 pm
I found it interesting that you said the people became desensitized to the image over time. It is amazing to me what a big difference the smallest changes can make. I would like to learn more about how I could do split testing for various features of my website. Thanks for sharing this information.
Justin Premick
3/2/2011 11:06 am
Elizabeth,
I think there’s a lot to be said for periodic testing and updating of elements of your email campaigns, as well as other marketing assets.
I would not at all be surprised to find a similar desensitization among people viewing ads on your website, viewing ads you are running to get people to your website, or even reading your subject lines (if you are regularly including the same things in your subjects – name personalization, your company name, ALL CAPS, and so on). There’s something to be said for giving your content a refresh now and then.
garry robinson
4/11/2012 11:30 am
Everyone: buttons, split testing, etc. Always remember that readers want good content, so don’t get carried away with all this testing if it causes your content quality and quantity to drop. Google searches are where we get our visitors from.
Kasim
9/5/2012 9:12 am
The test tells me that it’s a novelty effect. If the button outperforms for 2 weeks, then I would use it for 2 weeks a year.
These results remind me of what happens to the take-up rate of a product during and after a launch – you get a spike during a launch which then quickly comes down to normal levels.
So, you have to keep launching on a regular basis giving people time to forget the previous launch.