split testing

Email Testing Inspiration: Tips and a Recent Example

By Justin Premick

Sure, you’ve heard about the value of testing in your email marketing campaigns.

You’re familiar with some of the big gains that tests can bring you. Perhaps you’ve even looked over some of the examples of things you can split test and considered doing them.

But there’s a pretty good chance you’re not testing as much as you could, or even as much as you might like to. Most marketers aren’t.

Getting in the habit of testing requires us to think not only about what we want to test in a broad sense, but also about which specific tests to conduct (and how to conduct them) so that the results are useful to us.

Recently, I found a couple of articles that I thought would help you get more out of your existing email testing – or simply start conducting more tests to optimize your campaigns.

Email Testing Tips

Linda shares a presentation she gave on email testing.

While at first glance this presentation seems targeted to retailers and similar businesses, the key points are useful to anyone using email marketing to increase sales.

You’ll note that in the presentation, Linda argues against testing minor things (like the design of your call-to-action button) in favor of testing major changes (like the overall layout of your email).

While I think there is value in testing minor changes (we’ve even tested button design ourselves), making larger changes is a better place to start for most people:

  • The bigger swings in results you’re likely to see when testing major changes instead of minor ones can help inspire you to continue testing
  • It’s easier to start with broad changes and drill down to smaller ones in future tests than it is to test something minor and try to expand logically to larger tests. (Example: if you test multiple offers in one email vs. one offer, you can later test offer order and location; if all you test is whether adding an exclamation point lifts sales, where do you go from there – 2 exclamation points? 3?)

She also discusses “smart” landing pages. This is a great point that some people miss when testing:

If you test a different offer or content to different groups, make sure you’re not just dumping everybody on the same page.

There’s almost certainly a conversion lift to be had by matching each email version you’re testing to its own landing page, so that what subscribers read in the email lines up with the page they land on.
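Just to make that concrete, here’s a hypothetical sketch (in TypeScript) of mapping each email variant to its own landing page. The variant names and URLs are invented for illustration; they aren’t taken from Linda’s presentation.

```typescript
// Each email variant links to its own matching landing page instead of a
// single generic one. Variant keys and URLs are placeholders.
const landingPages: Record<string, string> = {
  "free-shipping-offer": "https://example.com/landing/free-shipping",
  "percent-off-offer": "https://example.com/landing/20-percent-off",
};

function landingPageFor(variant: string): string {
  // Fall back to a generic page if a variant has no dedicated landing page.
  return landingPages[variant] ?? "https://example.com/landing/default";
}

console.log(landingPageFor("free-shipping-offer"));
```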

How to Test and Track Web Forms

Darren discusses how he uses multiple forms to test what gets people to subscribe to his newsletter.

Not only does he give us an example of how you can track where subscribers are coming from, he also points out a few other uses of web form tracking that you can use to test the effectiveness of form location, web page copy and other aspects of online list-building. Useful stuff, especially if you haven’t done much web form testing.

Of course, there are certainly other things to consider when creating a signup form :)

Any Other Email Testing Inspiration?

Have you come across an example of a useful or thought-provoking email test, or tips on testing? Share them on the blog!


Increase Profit 2300% and Other Key Takeaways From MarketingSherpa’s Email Summit

By Justin Premick

As a reader of this blog, you might occasionally wonder, “Where is email marketing headed?”

Me too. So to see what others are doing with their email campaigns, Jay Moore (you’ve probably heard him on one of our free live webinars) and I recently headed to Miami for MarketingSherpa’s 2009 Email Summit.

After several long days jam-packed with case studies, presentations and chats about email marketing, we walked away with our brains full.

Here are my biggest takeaways from the Summit:

The Relevancy Challenge: Can You Break Through The Inbox Clutter?

The Summit kick-off featured a MarketingSherpa survey that found nearly three-fourths of consumers reporting a noticeable increase in email volume from subscriptions.

You might expect this to mean that subscribers are getting overwhelmed and simply unsubscribing en masse to cut back on email, regardless of who it’s from.

However, the survey also revealed that the #1 reason for unsubscribing was not email overload, but lack of relevance. On top of that, another popular reason for unsubscribing – “the email doesn’t apply to me” – further underscores the need to create relevant emails and stand out in the inbox.

Numerous other discussions touched on increasing relevance not only to reduce unsubscribes, but to lift conversions.

In one case study, retailer L’Occitane en Provence demonstrated how a simple use of email analytics – personalizing a promotion to subscribers with a picture of a product they had recently viewed – yielded a 2300% increase in profit (yes, 23 hundred percent) over a less relevant control email.

(By the way, AWeber can segment your subscribers based on what web pages they’ve viewed; that’s one of the features of our Email Web Analytics tools. Hopefully a 2300% increase in profit gets you thinking about how you might segment with that data :) )

Read more about email relevance.

Content Is King Again…

…or still, as many of you will already know.

One of the fascinating and heartening takeaways for me was the focus on quality content by several speakers.

Given that many big brands attended the Summit, I anticipated a strictly promotional email focus, but there was a lot of discussion of email newsletters and of welcome series (autoresponders) for new subscribers.

As competition for email users’ attention gets fiercer, the need to provide valuable content (which to subscribers is typically more relevant than promotions) increases.

In a down economy, with many businesses and marketers ramping up email volume, subscribers are more likely to open, read and click through from emails from senders who consistently deliver valuable content.

…and Testing Matters More Than Ever

I lost count of how many presentations focused on test results (half the presentations were case studies, which nearly always included test results), or on the gains to be had by testing.

To adapt a line from the movie Glengarry Glen Ross, a central theme at the Summit was “A.B.T. — Always Be Testing.” And I agree:

There’s almost never an email campaign where you shouldn’t be testing something – subject line copy, different styles and locations for your calls to action, adding a headshot to your email (one of our social networking tactics for email marketers), or whatever other split tests you believe may impact your campaigns’ effectiveness.

Read more about split testing.

More Takeaways From Email Summit

I amped up my use of Twitter during the Summit, since it was a great way to get bits of information out in real time (and to serve as an informal notepad).

  1. See my Email Summit tweets
  2. All Email Summit-tagged tweets

Were you at the Summit? If so, what were your takeaways?



{!firstname}, Think Before You Personalize

By Justin Premick

If you’ve been subscribing to email marketing campaigns for any length of time, you’ve probably experienced personalization several times.

How much of it impresses you? How much of it makes the email feel “personal?”

Yeah… me too.

Lately, I’m wondering whether as email marketers, we’ve allowed ourselves to get lazy with personalization, and whether we can do better.

I can hear some of you thinking,

“But personalization gets more opens and clicks!”

But Does It?

I’ve heard numerous marketers say it does. And it’s entirely possible – if not likely – that at least some of them regularly test this and continue to find it to be true.

But when’s the last time you tested it?

I’ll be honest here and say I haven’t tested it in quite a while – partly because other tests are more interesting or exciting (like testing what happens when you add social networking links to your emails).

Besides… Is The Click All That Matters?

A recent pair of articles has me thinking about what it means, in practical terms, to take a long-term approach to email.

I’m subscribed, as I’m sure you are, to many email lists. Many of the campaigns I receive have subject lines like:

  • Justin, Do You Have a Minute?
  • Exclusive Savings for Justin
  • Justin – Good news and bad news :(
  • Hi Justin
  • JUSTIN, Save 30% For Two Days Only!

Now, let’s face it: a lot of these emails would get the average person to open them. I opened them.

But does that mean they’re a good idea? What do you think of someone when they send you an email with those subjects?

  • Justin, Do You Have a Minute? – I did. And I just spent it on your email. Was it worth it?
  • Exclusive Savings for Justin – Is it for all people with my name? Is this National Justin Day? Why not just say “Exclusive Savings for You?”

    Personalization here, while it might get more opens, makes no sense when you read it.

  • Justin – Good news and bad news :( – Good/bad news for whom? This one isn’t the worst I’ve seen, but if the news isn’t really good or bad from the subscribers’ point of view, then you’re taking a very “me-centric” approach to your relationship with subscribers. Not good.
  • Hi Justin – this screams “I’m spam!”

    There’s technically nothing deceptive about saying Hi to someone in the subject line, but… it just feels wrong. It feels like a subject line that a long-lost friend or relative would use to reopen communication with you after disappearing for years.

    Wouldn’t you be mad to get an email with that subject, open it, and find it was an email campaign?

  • JUSTIN, Save 30% For Two Days Only! – Quick personalization tip: Don’t put my name in ALL CAPS, even if that’s how it is in your database.

    This is why, in AWeber, you can use the “fix” version of several variables (for example, {!firstname_fix}) to correct any incorrect capitalization. (There’s a rough sketch of the idea right after this list.)
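Just to illustrate what a “fix” variable implies, here’s a tiny TypeScript sketch that normalizes a stored name’s capitalization before it gets merged into a subject line or greeting. The function and its rules are my own simplification, not AWeber’s actual implementation.

```typescript
// Normalize whatever capitalization is in the database ("JUSTIN",
// "mary ann") before merging it into a subject line or greeting.
function fixFirstName(raw: string): string {
  return raw
    .trim()
    .toLowerCase()
    .split(/\s+/)
    .map(part => part.charAt(0).toUpperCase() + part.slice(1))
    .join(" ");
}

console.log(fixFirstName("JUSTIN"));      // "Justin"
console.log(fixFirstName("  mary ann ")); // "Mary Ann"
```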

How much more likely are you going to be to unsubscribe if you get an email with a subject like these? How much more likely to click “Spam?” How much less likely to open other emails later, or recommend that company to someone else?

Isn’t There More To A Truly “Personal” Email Than A Name?

Personalization isn’t a bad thing in and of itself. But when it gets misused for the sake of an extra open or click, it becomes a bad thing. It also becomes less effective over time. And it allows us to think that we’re creating “personal” emails just by merging a name into the message.

A truly personal email addresses the subscriber’s needs, desires, fears, preferences and other aspects of their personality.

Truly personal emails look at things like:

  • Which emails an individual subscriber has opened and clicked through from in the past
  • Which pages on your site s/he visits
  • How s/he originally found you and what inspired him/her to sign up to your list
  • And a lot more things that aren’t coming to mind at the moment

A lot of this isn’t typically considered personalization – it falls more under discussions of segmentation and targeting. But I think it’s worth considering that relevance and personalization are somewhat interchangeable when we think about it from the subscriber’s perspective, and not our own. A relevant email is personal, and a personal email is relevant.
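To give a rough idea of what that kind of behavioral data might look like, here’s a hypothetical subscriber record and one segment rule, sketched in TypeScript. The field names and the rule are invented for illustration; this isn’t an actual AWeber data structure.

```typescript
// Hypothetical subscriber record built from behavior, not just a name.
interface Subscriber {
  email: string;
  firstName: string;
  openedBroadcasts: number[];   // IDs of broadcasts opened
  clickedBroadcasts: number[];  // IDs of broadcasts clicked through
  pagesViewed: string[];        // URLs visited on your site
  signupSource: string;         // which form or page they signed up from
}

// Send a pricing-related follow-up only to people whose behavior suggests
// it's relevant to them, instead of blasting the whole list.
function pricingSegment(list: Subscriber[]): Subscriber[] {
  return list.filter(
    s =>
      s.pagesViewed.some(url => url.includes("/pricing")) ||
      s.clickedBroadcasts.length >= 3
  );
}

// Example usage: const recipients = pricingSegment(allSubscribers);
```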

Making truly personal emails isn’t easy. And I don’t profess to be the example to follow; I’m going to be re-examining a lot of the emails I send here at AWeber as a result of this discussion.

Care to do the same?



Get More Subscribers With These 3 Popup Form Split Tests

By Justin Premick

Do you use popup forms to collect subscribers faster and kick-start your email marketing campaign?

While they’re not always the right solution for all sites, some businesses have found they can increase opt-in rates significantly by adding popup forms.

Of course, getting the best results from your popup (just like any other aspect of your campaign) requires testing.

Increase Opt-In Rates For Your Popups: What to Test?

The other day, an AWeber user asked me if I had any suggestions on what he might test in order to increase the opt-in rate for his popup form.1

I sent along some suggestions that you may find helpful as well.

If you’re using popups now and want to make them better, test these modifications:

  1. Put an image in the popup.

    The idea behind this isn’t necessarily to get the visitor’s attention – the popup itself will do that (at least momentarily).

    The image should keep that attention long enough to get the visitor to read your form headline (tip: don’t make the image your entire headline – keep it small and let your text do the convincing).

    If you brand yourself on your website, try using a headshot in your form.

  2. Change the popup delay.

    You don’t have to have the form appear immediately when someone comes to your site.

    To start, test forms with significant (but not ridiculous) delay differences – say, 15 seconds vs. 30 seconds vs. 45 seconds.

    Once you have a winner, narrow down – maybe eventually as far as 5 second differences.

    Another approach here: rather than starting from 0 seconds, look at your website stats and start with the average amount of time that a visitor is on your page. Test forms with delays equal to that amount of time vs. forms with shorter and longer delays (start with say, 15-20 seconds on either side of your average visit length).

  3. Change how the popup enters the page.

    Does your current form pop immediately into the page?

    Test it against a form that fades into the page, or slides in from above, below or either side.

    Depending on where on your page visitors’ eyes are focused when your form appears, how it appears (and how suddenly it does so) may affect whether they immediately close it or read and complete it. (There’s a rough sketch of a delayed fade-in right after this list.)
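To make the delay and entry ideas a bit more concrete, here’s a rough browser-side sketch in TypeScript. It assumes a popover element with the id “signup-popover” already exists in the page and starts out hidden; the 30-second delay and 400ms fade are just example values to test against, not recommendations.

```typescript
// Show a hidden popover after a delay, fading it in rather than popping
// it abruptly onto the page. Element id, delay and fade duration are
// placeholder values for illustration.
const DELAY_MS = 30_000; // one variant of the delay test

window.setTimeout(() => {
  const popover = document.getElementById("signup-popover");
  if (!popover) return;

  popover.style.opacity = "0";
  popover.style.display = "block";
  // Force a style flush so the transition below actually animates.
  void popover.offsetHeight;

  popover.style.transition = "opacity 400ms ease-in";
  popover.style.opacity = "1";
}, DELAY_MS);
```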

Other Popup Tests: Your Suggestions?

Are you running split tests on popups? What have you found?

Share your findings and suggestions!


1 (Please note: when I say “popup” I don’t necessarily mean a traditional popup that appears in a new window. In this case I’m referring mainly to “popover” style forms that simulate a popup window. See this video on web forms for an example.)

7 Split Tests You Can Implement Today

By Justin Premick

How many times have you heard “you should test to see what works best for you” or something to that effect? Probably too many to count, right?

The reason you hear it so often is because when it comes to email marketing (as well as any other marketing channel), testing separates the pros from the Joes. It’s one thing to think we know what works best, but when we apply a little bit of scientific method to our marketing, we not only find out for sure, we learn more about our visitors and subscribers — and that helps us predict more accurately what will work in the future.

The challenge for a lot of people (including us at AWeber) is deciding what to test. There are simply so many small changes we can make to our forms, messages and other parts of our campaigns that it’s easy to get stuck deciding where to start.

So, to help you get started with split testing (or to get back into it if you’ve gotten complacent and stopped testing regularly), here are seven split tests you can run on your website to get and retain more subscribers, lower spam complaints, and increase response.

7 Split Tests

Give these a try and see how they affect your subscribers’ response (not to mention your perception of your subscribers).

See our Knowledge Base for instructions on how to create a web form split test and a broadcast split test.

  1. Create one signup form where you ask for name first, then email, and one where you ask for email first, then name.

    See if the order that you ask for information affects how many people sign up.

  2. Send one broadcast with personalization in the subject line, and one without.

    Do subscribers respond to personalization, or do they see it as a “gimmick?”

  3. Split your next message into three broadcasts with different sending times: one between 8AM and 9AM, one between 12PM and 1PM, and one between 4PM and 5PM. Compare open/click rates for each message.

    Find out what time of day your subscribers prefer to hear from you. (There’s a rough sketch of splitting a list into three random groups right after this list.)

  4. Try using a different call to action on your signup form besides the old classic “Submit.”

    Come up with 2-3 short phrases, create your forms and compare opt-in rates (a couple options: “Sign Me Up”, “Send Me _____”, “Keep Me Informed”). Keep whatever you’re using now too, and make it the “control” in your experiment.

    Not everyone wants to submit to getting email from you. Find out what trigger they respond to.

  5. Add a privacy statement (i.e. “we will not share your email address…”) to your signup form. Create another form where you instead link to a privacy policy on another page of your site.

    Compare opt-in rates for those forms against a form where you make no privacy statement.

    Are visitors more likely to sign up if you tell them you will treat their inbox with respect, and differentiate your email practices from others’?

  6. In your next HTML email, test using a button for your call to action against using a text link.

    Is a well-written text link more compelling than a colorful, more prominent button?

  7. For your next broadcast, add a permission reminder (“you’re receiving this email because you signed up at ____” etc) in the message. Compare your clickthrough rates, and your spam complaint rates.

    Does reminding people why they’re receiving an email make them any more likely to recognize and trust you? Does it make them more likely to read through your email and/or click on links in it?
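If you’re curious how the send-time split in test #3 might be set up mechanically, here’s a rough TypeScript sketch of randomly dividing a list into three equal groups. It only handles group assignment; the scheduling and sending stay with your email provider, and the addresses below are placeholders.

```typescript
// Randomly split a subscriber list into N equal groups for a send-time
// test (e.g. 8-9AM vs. 12-1PM vs. 4-5PM).
function splitIntoGroups<T>(list: T[], groups: number): T[][] {
  const shuffled = [...list];
  // Fisher-Yates shuffle so group assignment is unbiased.
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  const buckets: T[][] = Array.from({ length: groups }, () => []);
  shuffled.forEach((item, idx) => buckets[idx % groups].push(item));
  return buckets;
}

// Example with placeholder addresses.
const subscribers = ["a@example.com", "b@example.com", "c@example.com",
                     "d@example.com", "e@example.com", "f@example.com"];
const [morningGroup, middayGroup, afternoonGroup] = splitIntoGroups(subscribers, 3);
console.log(morningGroup.length, middayGroup.length, afternoonGroup.length);
```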

You might have some interesting findings (if so, please share them!)…

…but even if you don’t — even if you run all 7 of these split tests and none of them bring immediate, significant changes to your campaign — you’ll still be more familiar with split testing than you were yesterday, and better prepared to test and improve your campaigns in the future.

What Do YOU Split Test?

Are there other split tests that you run regularly?

Share them with your fellow email marketers!

Do Buttons Get Clicked More Than Text Links?

By Justin Premick

Many of our readers have already signed up to the live seminar on split testing that we announced last week.

But even if you can’t make it, you’re probably interested in learning more about split testing now, right?

Fortunately, we happen to have a case study on hand that shows just the sort of information you can learn about your email marketing campaigns by conducting split tests.

Today, let’s look at a split test that we ran on our own blog newsletter to get more of you to come to the site and read the latest posts.

The Test

Last year, Marc and I were discussing how to increase clickthroughs on the emails we send to our blog subscribers.

One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”

Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.

So, Marc created a button-shaped image with the words “Read More” stamped on it.

We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.

The emails were otherwise identical — we kept subject lines, sending dates/times and templates the same for each version.

Measuring Our Success

Clicks-to-Opens: clicks divided by opens.

More email statistics.

Since we’re trying to get people to visit our site to read the full blog posts, we want to compare clickthrough rates.

We chose to use clicks-to-opens as our measure of success. Instead of dividing clickthroughs by the number of emails sent, we divided them by the number of emails opened.

That way, if one version of the message got an unusually high number of opens, it wouldn’t skew the results to make that version’s call to action look more effective than it really was.
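To make the metric concrete, here’s a small TypeScript sketch (with invented numbers) showing how clicks-to-opens and the relative lift between two versions are calculated.

```typescript
// Clicks-to-opens: unique clicks divided by unique opens (not by emails
// sent). All numbers here are invented for illustration.
function clicksToOpens(clicks: number, opens: number): number {
  return opens === 0 ? 0 : clicks / opens;
}

const buttonRate = clicksToOpens(150, 1000); // 0.15
const textRate = clicksToOpens(100, 1010);   // ~0.099

// Relative lift of the button version over the text version, in percent.
const lift = ((buttonRate - textRate) / textRate) * 100; // ~51.5%
console.log(`Button lift over text: ${lift.toFixed(1)}%`);
```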

Our Expectations

Going into this test, we expected the button to beat the text links handily.

Why?

  • It was physically larger than the text link.
  • It contained a clear call to action — “Read More” — while a contextual link might be less obvious.
  • It was an image placed in a part of the email where readers hadn’t previously been shown images.

Basically, we expected the button would grab people’s attention as they scanned through the email.

On the flipside, we knew that readers might have images disabled and wouldn’t see the button.

So we added the ALT text “Read More” to the button image.

Since the ALT text “Read More” would appear in place of the button, we felt that even for readers with images disabled, the button should do at least approximately as well as the text link.
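For context, the markup idea is simply an image link whose alt attribute carries the call to action. The filename and URL below are placeholders, and the snippet is shown as a string purely for illustration.

```typescript
// The alt text ("Read More") is what readers see when images are disabled,
// so the call to action survives even without the button image.
const readMoreButton = `
  <a href="https://example.com/blog/latest-post">
    <img src="read-more-button.png" alt="Read More" width="120" height="40">
  </a>
`;
console.log(readMoreButton);
```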

Initial Results

In our first split test, the button call to action outperformed our text link by 51.4%.

We started running the test on our Blog Broadcasts last year.

As we expected, the button grabbed readers’ attention and enticed them to click through much better than the text link did.

Clicks-to-opens for the button was repeatedly higher — a lot higher — than it was for the text link.

In our first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.

At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”

…But, We Kept Going!

We could have stopped after those first few tests — and in many cases, one-time or short-term split tests are appropriate.

However, even in our initial few tests, the text had beaten the button once, and by a large margin.

I wanted to see whether the button was doing better because it was a more compelling call to action in general, or because of the “novelty” of it.

So we continued to split test our Blog Broadcasts…

Later Results

Further testing showed that using buttons instead of text was not a good long-run tactic.

We ultimately ran the button-versus-text split test about 40 times, over the course of several months.

For a while, the button continued to beat the text links — but we noticed that it wasn’t doing so by as large a margin as it first had.

While over our first five tests, the button beat the text by over 33%, after 20 tests it was only winning by an average of 17.29%, and the text version was beginning to hold its own in the win column.

Final Outcome

With each new split test, the text asserted itself as the better call to action.

By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.

Conclusions: What Happened?

The button is effective in the short run, but after a while readers become “numb” to it and no longer respond at the same initial high rate.

Consider the following two stats from our split tests:

  • Overall, text links outperformed buttons 53% of the time.
  • After an initial period where the button was more effective, text links outperformed buttons 67% of the time.

That first stat doesn’t seem important in and of itself — 53% is barely over half the time.

However, for the first three weeks of the experiment, the button won two-thirds of our split tests. After that, the opposite became true — the button just stopped “working.”

Which brings us to conclusion #2:

Test results are not forever.

What works today may not work tomorrow.

Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.

It’s important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.

Finally, one last point I feel obligated to make:

What works for someone else may not work for you.

The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.

Buttons may work well for you in the short run. Split test them.

We send our blog newsletter often — 2 or 3 times a week. So we exposed subscribers to the button often, which may have increased the speed with which they started to ignore it.

If you send less frequently, or only use a button for emails where you have a particularly compelling offer, you may find it to be effective.

Plus, we tested one specific button. Another button, with a different design or wording, might have been more effective.

Again, don’t just take our word for it. Find out for yourself through your own testing.

Learn More About Split Testing

Unfamiliar with split testing? Want to see how you can methodically raise response rates for your email marketing campaigns?

Join us for a free one-hour seminar on split testing tomorrow — Wednesday, March 26th:

Split Testing Emails and Web Forms

Wednesday, March 26, 2008

2:00 – 3:00PM ET

Convert to Your Time (New Window)

Register Now

What Do You Think?

What are your thoughts on this case study?

Have you tested calls to action or other elements of your email newsletters? What were your findings?

Can you think of any other factors that may have influenced our results?

Share your reactions on the blog!

Webinar: Split Testing Emails and Web Forms

By Marc Kline


Sometimes when we work on our email newsletters, we’re distracted from actually writing content by the endless possibilities our imaginations present.

We wonder whether one call to action might drive more clicks than another, or whether a message with subject A might get more opens than one with subject B.

Well, what they say is true: you’ll never know until you try. And yet split testing – the best way to experiment and gather information that helps us optimize our campaigns – is an underutilized tool in email marketing, especially among those relatively new to it.

What Makes it So Scary?

I think people avoid split testing because it can sound complicated, like something that should be done in a lab, with results run through spreadsheets and statistics software.

The truth is, a split test trial is not hard to execute, the results are easy to read and understand, and they’re generally well worth your time.

After all, the more we understand about our website visitors and subscribers, the better we can target them. While surveying or soliciting feedback are ways to learn a bit about them, split testing provides a more objective, bird’s-eye view.

Split Testing: A Practical Introduction

If you’ve ever wondered about split testing but aren’t sure where to start, there’s a solution approaching quickly on the horizon.

Join our Education Team for a live seminar on this topic. We’ll provide a quick step-by-step walk-through on how to test, along with some ideas you can try just as soon as you’re out the door:

Split Testing Emails and Web Forms

Wednesday, March 26, 2008

2:00 – 3:00PM ET

Convert to Your Time (New Window)

Register Now!

We’ll cover what, why, and when you should split test both signup forms and email messages.

Other Split Testing Resources

If you can’t join us for the seminar, sign up anyway, and we’ll send you a video recording of the session afterwards.

Or, if you’d like to learn a little more now, you can start in our knowledge base.

Hope to see you on the call!

AWeber’s Sean Cohen Interviewed at Recognized Expert

By Justin Premick

Setting up your first email campaign and not sure what to do? Need a refresher course on how to put together a simple yet effective email marketing program to convert your website traffic into loyal customers?

Our own Sean Cohen turned up on the Recognized Expert Marketing Show last week with plenty of great advice for budding — and established — businesses.

Listen to Sean’s Email Marketing Advice

In this 34-minute interview, Sean talks to host Bob Sommers about:

  • Where to Get Message Ideas and Content
  • Keys to Getting More Subscribers Ethically
  • How Split-Testing Can Help You Grow Your List Faster
  • Text vs. HTML: Does It Really Affect Deliverability?

Listen To The Half-Hour Interview

Test. Retest. Then Do It Again.

By Justin Premick

Interesting conversation going on over at Copyblogger, where Brian Clark is talking about calls to action.

He points to a MarketingSherpa test that suggests that using the word “click” — as in “click here” or “click to continue” — can boost response.

You can make a lot of arguments for or against using that specific wording (and people appear to be doing so in the comments).

But what stood out to me as the real lesson is…

Click to Find Out!

Split Testing: Interpreting An Example

By Justin Premick

I brought up the topic of split testing a while back. However, I didn’t have a sample split test to refer you to at the time.

So, I went back and found an example. Let’s take a look at a split test, what was varied, and what we might infer from our results.