In the most basic sense, a solid email marketing campaign starts with a captivating web form. Does your web form do everything it should to make a good first impression and compel visitors to subscribe? Or does it fall short, leaving your list smaller than it should be?
Sure, you’ve heard about the value of testing in your email marketing campaigns.
But there’s a pretty good chance you’re not testing as much as you could, or even as much as you might like to. Most marketers aren’t.
Getting in the habit of testing requires us to think not only about what we want to test in a broad sense, but also what tests to conduct (and how to conduct them) so that the results are useful to us.
Recently, I found a couple of articles that I thought would help you get more out of your existing email testing – or simply start conducting more tests to optimize your campaigns.
Email Testing Tips
While at first glance this presentation seems targeted to retailers and similar businesses, the key points are useful to anyone using email marketing to increase sales.
You’ll note that in the presentation, Linda argues against testing minor things (like the design of your call-to-action button) in favor of testing major changes (like the overall layout of your email).
While I think there is value in testing minor changes (we’ve even tested button design ourselves), making larger changes is a better place to start for most people:
- The larger swings in your results that you're likely to see when testing big changes rather than small ones can inspire you to keep testing.
- It’s easier to start with broad changes and drill down to smaller ones in future tests than it is to test something minor and try to expand logically to larger tests. (Example: if you test multiple offers in one email vs. one offer, you can later test offer order and location; if all you test is whether adding an exclamation point lifts sales, where do you go from there – 2 exclamation points? 3?)
She also discusses “smart” landing pages. This is a great point that some people miss when testing:
If you test a different offer or content to different groups, make sure you’re not just dumping everybody on the same page.
You can almost certainly lift conversions by making the landing page for each version of your email match what that version says.
How to Test and Track Web Forms
Not only does he give us an example of how you can track where subscribers are coming from, he also points out a few other uses of web form tracking that you can use to test the effectiveness of form location, web page copy and other aspects of online list-building. Useful stuff, especially if you haven’t done much web form testing.
Of course, there are other things to consider when creating a signup form.
Any Other Email Testing Inspiration?
Have you come across an example of a useful or thought-provoking email test, or tips on testing? Share them on the blog!
As a reader of this blog, you might occasionally wonder, “Where is email marketing headed?”
Me too. So to see what others were doing with their email campaigns, Jay Moore (you’ve probably heard him on one of our free live webinars recently) and I recently headed to Miami for MarketingSherpa’s 2009 Email Summit.
After several long days jam-packed with case studies, presentations and chats about email marketing, we walked away with our brains full.
Here are my biggest takeaways from the Summit:
The Relevancy Challenge: Can You Break Through The Inbox Clutter?
The Summit kick-off featured a MarketingSherpa survey that found nearly three-fourths of consumers reporting a noticeable increase in email volume from subscriptions.
You might expect this to mean subscribers are getting overwhelmed and simply unsubscribing en masse to cut back on email, regardless of who it’s from.
However, the survey also revealed that the #1 reason for unsubscribing was not email overload, but lack of relevance. On top of that, another popular reason for unsubscribing – “the email doesn’t apply to me” – further underscores the need to create relevant emails and stand out in the inbox.
Numerous other discussions touched on increasing relevance not only to reduce unsubscribes, but to lift conversions.
In one case study, retailer L’Occitane en Provence demonstrated how a simple use of email analytics – personalizing a promotion to subscribers with a picture of a product they had recently viewed – yielded a 2300% increase in profit (yes, 23 hundred percent) over a less relevant control email.
(By the way, AWeber can segment your subscribers based on what web pages they’ve viewed; that’s one of the features of our Email Web Analytics tools. Hopefully a 2300% increase in profit gets you thinking about how you might segment with that data.)
Read more about email relevance.
Content Is King Again…
…or still, as many of you will already know.
One of the fascinating and heartening takeaways for me was the focus on quality content by several speakers.
Given that many big brands attended the Summit, I anticipated a strictly promotional email focus, but there was a lot of discussion of email newsletters and of welcome series (autoresponders) for new subscribers.
As competition for email users’ attention gets fiercer, the need to provide valuable content (which to subscribers is typically more relevant than promotions) increases.
In a down economy, with many businesses and marketers ramping up email volume, subscribers are more likely to open, read and click through from emails from senders who consistently deliver valuable content.
…and Testing Matters More Than Ever
I lost count of how many presentations focused on test results (half the presentations were case studies, which nearly always included test results), or on the gains to be had by testing.
To adapt a line from the movie Glengarry Glen Ross, a central theme at the Summit was “A.B.T. — Always Be Testing.” And I agree:
There’s almost never an email campaign where you shouldn’t be testing something – subject line copy, different styles and locations for your calls to action, adding a headshot to your email (one of our social networking tactics for email marketers), or whatever other split tests you believe may impact your campaigns’ effectiveness.
Read more about split testing.
More Takeaways From Email Summit
I amped up my usage of Twitter during the Summit as it was a great way to get bits of information out in real time (and to serve as an informal notepad).
Were you at the Summit? If so, what were your takeaways?
If you’ve been subscribing to email marketing campaigns for any length of time, you’ve probably experienced personalization several times.
How much of it impresses you? How much of it makes the email feel “personal?”
Yeah… me too.
While they’re not always the right solution for all sites, some businesses have found they can increase opt-in rates significantly by adding popup forms.
Of course, getting the best results from your popup (just like any other aspect of your campaign) requires testing.
Increase Opt-In Rates For Your Popups: What to Test?
The other day, an AWeber user asked me if I had any suggestions on what he might test in order to increase the opt-in rate for his popup1 form.
I sent along some suggestions that you may find helpful as well.
If you’re using popups now and want to make them better, test these modifications:
- Put an image in the popup.
The idea behind this isn’t necessarily to get the visitor’s attention – the popup itself will do that (at least momentarily).
The image should keep that attention long enough to get the visitor to read your form headline (tip: don’t make the image your entire headline – keep it small and let your text do the convincing).
If you brand yourself on your website, try using a headshot in your form.
- Change the popup delay.
You don’t have to have the form appear immediately when someone comes to your site.
To start, test forms with significant (but not ridiculous) delay differences – say, 15 seconds vs. 30 seconds vs. 45 seconds.
Once you have a winner, narrow down – maybe eventually as far as 5 second differences.
Another approach here: rather than starting from 0 seconds, look at your website stats and start with the average amount of time that a visitor is on your page. Test forms with delays equal to that amount of time vs. forms with shorter and longer delays (start with say, 15-20 seconds on either side of your average visit length).
- Change how the popup enters the page.
Does your current form pop immediately into the page?
Test it against a form that fades into the page, or slides in from above, below or either side.
Depending where on your page visitors’ eyes are focused when your form appears, how it appears (and how suddenly it does so) may affect whether they immediately close it or read and complete it.
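To make the delay comparison concrete, here’s a minimal sketch of how a client-side script might assign each visitor one delay variant. The function and variable names are mine, purely for illustration; AWeber’s web form split testing handles variant assignment for you.

```typescript
// Illustrative only: a sketch of assigning each visitor one
// randomly chosen popup delay for a split test.

const DELAY_VARIANTS_SECONDS = [15, 30, 45];

// Map a uniform random value in [0, 1) to one delay variant.
function pickDelayVariant(
  rand: number,
  delays: number[] = DELAY_VARIANTS_SECONDS
): number {
  const index = Math.min(Math.floor(rand * delays.length), delays.length - 1);
  return delays[index];
}

// In the browser, you would then schedule a (hypothetical)
// showPopupForm() after the chosen delay, recording which variant
// this visitor saw so signups can be attributed to it:
//   const delay = pickDelayVariant(Math.random());
//   setTimeout(showPopupForm, delay * 1000);
```

The key design point is that each visitor sees exactly one variant, so you can compare opt-in rates between delay groups rather than within them.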
Other Popup Tests: Your Suggestions?
Are you running split tests on popups? What have you found?
1 (Please note: when I say “popup” I don’t necessarily mean a traditional popup that appears in a new window. In this case I’m referring mainly to “popover” style forms that simulate a popup window. See this video on web forms for an example.)
The reason you hear it so often is that when it comes to email marketing (as well as any other marketing channel), testing separates the pros from the Joes. It’s one thing to think we know what works best, but when we apply a little bit of scientific method to our marketing, we not only find out for sure, we learn more about our visitors and subscribers — and that helps us predict more accurately what will work in the future.
The challenge for a lot of people (including us at AWeber) is deciding what to test. There are simply so many small changes we can make to our forms, messages and other parts of our campaigns, that it’s easy to get stuck on deciding where to start.
So, to help you get started with split testing (or to get back into it if you’ve gotten complacent and stopped testing regularly), here are seven split tests you can run on your website to get and retain more subscribers, lower spam complaints, and increase response.
7 Split Tests
Give these a try and see how they affect your subscribers’ response (not to mention your perception of your subscribers).
- Create one signup form where you ask for name first, then email, and one where you ask for email first, then name.
See if the order that you ask for information affects how many people sign up.
- Send one broadcast with personalization in the subject line, and one without.
Do subscribers respond to personalization, or do they see it as a “gimmick?”
- Split your next message into three broadcasts with different sending times: one between 8AM and 9AM, one between 12PM and 1PM, and one between 4PM and 5PM. Compare open/click rates for each message.
Find out what time of day your subscribers prefer to hear from you.
- Try using a different call to action on your signup form besides the old classic “Submit.”
Come up with 2-3 short phrases, create your forms and compare opt-in rates (a couple options: “Sign Me Up”, “Send Me _____”, “Keep Me Informed”). Keep whatever you’re using now too, and make it the “control” in your experiment.
Not everyone wants to submit to getting email from you. Find out what trigger they respond to.
- Add a privacy statement to your signup form (“we will never share your email address” or similar).
Compare opt-in rates for those forms against a form where you make no privacy statement.
Are visitors more likely to sign up if you tell them you will treat their inbox with respect, and differentiate your email practices from others’?
- In your next HTML email, test using a button for your call to action against using a text link.
Is a well-written text link more compelling than a colorful, more prominent button?
- For your next broadcast, add a permission reminder (“you’re receiving this email because you signed up at ____” etc) in the message. Compare your clickthrough rates, and your spam complaint rates.
Does reminding people why they’re receiving an email make them any more likely to recognize and trust you? Does it make them more likely to read through your email and/or click on links in it?
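The send-time test above (morning vs. midday vs. late afternoon) just needs your list divided into comparable groups. As a rough sketch of the mechanics — the function name is mine, not an AWeber API — a round-robin split looks like this; in practice, shuffle the list first so the groups don’t inherit any ordering bias (newest subscribers first, for example):

```typescript
// Illustrative sketch: divide a subscriber list into n groups of
// (nearly) equal size by dealing items out round-robin.
function splitIntoGroups<T>(items: T[], n: number): T[][] {
  const groups: T[][] = Array.from({ length: n }, () => [] as T[]);
  items.forEach((item, i) => groups[i % n].push(item));
  return groups;
}
```

Once split, each group gets the same message at a different time, and you compare open and click rates across groups.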
You might have some interesting findings (if so, please share them!)…
…but even if you don’t — even if you run all 7 of these split tests and none of them bring immediate, significant changes to your campaign — you’ll still be more familiar with split testing than you were yesterday, and better prepared to test and improve your campaigns in the future.
What Do YOU Split Test?
Are there other split tests that you run regularly?
Many of our readers have already signed up for the live seminar on split testing that we announced last week.
But even if you can’t make it, you’re probably interested in learning more about split testing now, right?
Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.
Today, let’s look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.
Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.
One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”
Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.
So, Marc created a button-shaped image with the words “Read More” stamped on it:
We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.
The emails were otherwise identical — we kept subject lines, sending dates/times and templates the same for each version.
Measuring Our Success
Since we’re trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.
We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.
That way, if one version of the message got an unusually high number of opens, it wouldn’t skew the results to make that version’s call to action look more effective than it really was.
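To make the difference between the two metrics concrete, here’s a small sketch of both. The numbers in the comments are invented for illustration; they are not figures from this case study.

```typescript
// Two ways to measure clickthrough.

function clickthroughRate(clicks: number, sent: number): number {
  return clicks / sent; // skewed if one version draws far more opens
}

function clicksToOpens(clicks: number, opens: number): number {
  return clicks / opens; // measures the call to action itself
}

// Suppose both versions went to 1,000 subscribers and each drew
// 100 clicks, but version A got 500 opens and version B got 800.
// Per-sent clickthrough is identical (0.10 for both), while
// clicks-to-opens (0.20 vs. 0.125) shows that version A's call to
// action converted its openers far better.
```

In other words, clicks-to-opens isolates how well the call to action performs among people who actually saw it.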
Going into this test, we expected the button to beat the text links handily.
Basically, we expected the button would grab people’s attention as they scanned through the email.
On the flipside, we knew that readers might have images disabled and wouldn’t see the button.
So we added the ALT text “Read More” to the button image.
Since with images disabled the text “Read More” would appear in place of the button, we felt that even for those readers with images disabled, the button should do at least approximately as well as the text link.
We started running the test on our Blog Broadcasts last year.
As we expected, the button grabbed readers’ attention and prompted them to click through much better than the text link did.
Clicks-to-opens for the button was repeatedly higher — a lot higher — than it was for the text link.
In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.
At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”
…But, We Kept Going!
We could have stopped after those first few tests — and in many cases, one-time or short-term split tests are appropriate.
However, even in our initial few tests, the text had beaten the button once, and by a large margin.
I wanted to see whether the button was doing better because it was a more compelling call to action in general, or because of the “novelty” of it.
So we continued to split test our Blog Broadcasts…
We ultimately ran the button-versus-text split test about 40 times, over the course of several months.
For a while, the button continued to beat the text links — but we noticed that it wasn’t doing so by as large a margin as it first had.
While over our first five tests, the button beat the text by over 33%, after 20 tests it was only winning by an average of 17.29%, and the text version was beginning to hold its own in the win column.
With each new split test, the text asserted itself as the better call to action.
By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.
Conclusions: What Happened?
Consider the following two stats from our split tests:
- Overall, text links outperformed buttons 53% of the time.
- After an initial period where the button was more effective, text links outperformed buttons 67% of the time.
That first stat doesn’t seem important in and of itself — 53% is barely over half the time.
However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true — the button just stopped “working.”
Which brings us to conclusion #2:
What works today may not work tomorrow.
Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.
It’s important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.
Finally, one last point I feel obligated to make:
The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.
We send our blog newsletter often — 2 or 3 times a week. So we exposed subscribers to the button often, which may have increased the speed with which they started to ignore it.
If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.
Plus, we tested a specific button. Perhaps another one, with different design or wording, may have been more effective.
Again, don’t just take our word for it. Find out for yourself through your own testing.
Learn More About Split Testing
Unfamiliar with split testing? Want to see how you can methodically raise response rates for your email marketing campaigns?
Join us for a free one-hour seminar on split testing tomorrow — Wednesday, March 26th:
What Do You Think?
What are your thoughts on this case study?
Have you tested calls to action or other elements of your email newsletters? What were your findings?
Can you think of any other factors that may have influenced our results?
Sometimes when we work on our email newsletters, we’re distracted from actually writing content by the endless possibilities our imaginations present.
We wonder whether one call to action might draw more clicks than another, or whether a message with subject A might get more opens than one with subject B.
Well, what they say is true: you’ll never know until you try. And yet, split testing – the best way to experiment and obtain information that helps us to optimize our campaigns – is an underutilized tool in email marketing, especially by those relatively new to email marketing.
What Makes it So Scary?
I think people avoid split testing because it can sound complicated, like something that should be done in a lab, with results run through spreadsheets and statistics software.
The truth is, a split test trial is not hard to execute, the results are easy to read and understand, and they’re generally well worth your time.
After all, the more we understand about our website visitors and subscribers, the better we can target them. While surveying or soliciting feedback are ways to learn a bit about them, split testing provides a more objective, bird’s-eye view.
Split Testing: A Practical Introduction
If you’ve ever wondered about split testing but aren’t sure where to start, there’s a solution approaching quickly on the horizon.
Join our Education Team for a live seminar on this topic. We’ll provide a quick step-by-step walk-through on how to test, along with some ideas you can try just as soon as you’re out the door:
We’ll cover what, why, and when you should split test both sign up forms and email messages.
Other Split Testing Resources
If you can’t join us for the seminar, sign up anyway, and we’ll send you a video recording of the session afterwards.
Or, if you’d like to learn a little more now, you can start in our knowledge base.
Hope to see you on the call!
Setting up your first email campaign and not sure what to do? Need a refresher course on how to put together a simple yet effective email marketing program to convert your website traffic into loyal customers?
Our own Sean Cohen turned up on the Recognized Expert Marketing Show last week with plenty of great advice for budding — and established — businesses.
Listen to Sean’s Email Marketing Advice
In this 34-minute interview, Sean talks to host Bob Sommers about:
Interesting conversation going on over at Copyblogger where Brian Clark is talking about calls to action.
He points to a MarketingSherpa test that suggests that using the word “click” — as in “click here” or “click to continue” — can boost response.
You can make a lot of arguments for or against using that specific wording (and people appear to be doing so in the comments).
But what stood out to me as the real lesson is…