Do Buttons Get Clicked More Than Text Links?

Many of our readers have already signed up for the live seminar on split testing that we announced last week.

But even if you can’t make it, you’re probably interested in learning more about split testing now, right?

Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.

Today, let’s look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.

The Test

Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.

One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”

Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.

So, Marc created a button-shaped image with the words “Read More” stamped on it.

We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.

The emails were otherwise identical: we kept subject lines, sending dates/times and templates the same for each version.

Measuring Our Success

Clicks-to-opens: clicks divided by opens.

Since we’re trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.

We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.

That way, if one version of the message got an unusually high number of opens, it wouldn’t skew the results to make that version’s call to action look more effective than it really was.
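
If it helps to see the arithmetic, here’s a minimal sketch of that comparison. The counts below are invented for illustration only; they are not numbers from our tests.

```python
# Minimal sketch of a clicks-to-opens comparison.
# The counts below are invented for illustration, not results from our tests.

def clicks_to_opens(clicks, opens):
    """Clicks-to-opens rate: unique clicks divided by unique opens."""
    return clicks / opens if opens else 0.0

text_link = clicks_to_opens(clicks=120, opens=1000)  # 12.0%
button = clicks_to_opens(clicks=150, opens=900)      # ~16.7%

# Dividing by opens (not by emails sent) keeps an unusually "open-happy"
# send from making one call to action look better than it really is.
lift = (button - text_link) / text_link * 100        # ~38.9% higher
print(f"Text link: {text_link:.1%}  Button: {button:.1%}  Lift: {lift:.1f}%")
```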

Our Expectations

Going into this test, we expected the button to beat the text links handily.

Why?

It was physically larger than the text link.
It contained a clear call to action, “Read More,” while a contextual link might be less obvious.
It was an image placed in a part of the email where readers hadn’t previously been shown images.

Basically, we expected the button would grab people’s attention as they scanned through the email.

On the flip side, we knew that some readers might have images disabled and wouldn’t see the button.

So we added the ALT text “Read More” to the button image.

Since the ALT text “Read More” would appear in place of the graphic, we felt that even for readers with images disabled, the button should do at least approximately as well as the text link.
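
For anyone curious what that looks like in practice, here’s a rough sketch of a button image with an ALT text fallback. The URL, dimensions and styling are placeholders, not our actual template markup.

```python
# Rough sketch of a "Read More" button with an ALT text fallback.
# The URL, dimensions, and styling are placeholders, not an actual template.
button_html = (
    '<a href="https://blog.example.com/latest-post">'
    '<img src="https://example.com/read-more-button.png" '
    'alt="Read More" width="120" height="40" style="border:0;">'
    '</a>'
)

# With images disabled, most email clients show the alt attribute
# ("Read More") in place of the graphic, so the call to action survives.
print(button_html)
```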

Initial Results

In our first split test, the button call to action outperformed our text link by 51.4%.

We started running the test on our Blog Broadcasts last year.

As we expected, the button grabbed readers’ attention and drove clickthroughs much better than the text link did.

Clicks-to-opens for the button was repeatedly higher (a lot higher) than it was for the text link.

In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.
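
As a rough sketch of how that kind of average shakes out across several tests, consider the snippet below. The rates are made up for illustration, not our actual data.

```python
# Made-up clicks-to-opens rates for five hypothetical split tests --
# illustration only, not actual data. Each pair is (button, text link).
tests = [(0.18, 0.12), (0.16, 0.13), (0.11, 0.14), (0.20, 0.13), (0.17, 0.12)]

lifts = [(button - text) / text * 100 for button, text in tests]
button_wins = sum(1 for button, text in tests if button > text)

print("Per-test lift of button over text (%):", [round(l, 1) for l in lifts])
print(f"Average lift: {sum(lifts) / len(lifts):.1f}%")
print(f"Button won {button_wins} of {len(tests)} tests")
```

In that made-up run, the button loses one of the five tests but still comes out well ahead on average, which is roughly the shape our early results took.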

At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”

…But, We Kept Going!

We could have stopped after those first few tests; in many cases, one-time or short-term split tests are appropriate.

However, even in our initial few tests, the text had beaten the button once, and by a large margin.

I wanted to see whether the button was doing better because it was a more compelling call to action in general, or simply because of its novelty.

So we continued to split test our Blog Broadcasts…

Later Results

Further testing showed that using buttons instead of text was not a good long-run tactic.

We ultimately ran the button-versus-text split test about 40 times, over the course of several months.

For a while, the button continued to beat the text links, but we noticed that it wasn’t doing so by as large a margin as it first had.

While the button beat the text by over 33% across our first five tests, after 20 tests it was winning by an average of only 17.29%, and the text version was beginning to hold its own in the win column.

Final Outcome

With each new split test, the text asserted itself as the better call to action.

By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, often by double-digit margins of up to nearly 35%.

Conclusions: What Happened?

Conclusion #1: the button is effective in the short run, but after a while readers become “numb” to it and no longer respond at the same initial high rate.

Consider the following two stats from our split tests:

That first stat doesn’t seem important in and of itself: 53% is barely over half the time.

However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true: the button just stopped “working.”

Which brings us to conclusion #2:

Test results are not forever.

What works today may not work tomorrow.

Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.

It’s important to continually test your email campaigns so that you know what works, rather than assuming you know.

Finally, one last point I feel obligated to make:

What works for someone else may not work for you.

The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.

Buttons may work well for you in the short run. Split test them.

We send our blog newsletter often: two or three times a week. So subscribers saw the button frequently, which may have accelerated how quickly they began to ignore it.

If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.

Plus, we tested one specific button. Another one, with a different design or wording, may have been more effective.

Again, don’t just take our word for it. Find out for yourself through your own testing.

What Do You Think?

What are your thoughts on this case study?

Have you tested calls to action or other elements of your email newsletters? What were your findings?

Can you think of any other factors that may have influenced our results?

Share your reactions below!
