A/B Test: What Kind of Guarantee Boosted the Opt-in Rate 48.32%?


Advertising a money-back guarantee can definitely convince hesitant customers to open their wallets. It makes intuitive sense, but there’s also some data behind this. For instance, a 2011 study in the Journal of Retailing found that money-back guarantees “evoke a positive emotional response, thereby increasing consumers’ purchase intentions and willingness to pay a price premium.”

But can you increase your revenue simply by changing the way you present that guarantee?

And beyond that: can you drive additional engagement even before the purchase stage by mentioning the guarantee?

Almost certainly. In today’s featured A/B test, one LeadBox™ for a coupon outperformed the other by 48.32% when it framed its money-back guarantee differently.

Version A invited customers to “Put it to the test risk free. You have nothing to lose but your competition.” Version B was more straightforward: “Guaranteed to improve performance or your money back.”

Which LeadBox™ do you think increased the opt-in rate by a relative 48.32%?

Note one complicating factor before you guess: the headline color also changed between variations (red in Version A, blue in Version B), so consider whether this could have an impact, too.

Go down to the comments and tell us which one you’d choose and why—then vote below to see if you were right!

Vote: Which Page Won This Split Test?

  • Version A: Red headline, "risk free" guarantee
  • Version B: Blue headline, "money back" guarantee

Winner: Version B created an overall increase of 48.32%.

How People Voted: 49% / 51%

If you chose Version B, you are correct.

Version B, with a 96.07% probability of outperforming Version A, produced a relative 48.32% conversion-rate lift.
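If you're curious how figures like these are calculated, here's a rough sketch. The raw visitor counts behind this test weren't published, so the numbers below are assumptions chosen purely to illustrate the arithmetic, and the Bayesian "probability of outperforming" shown is one common way to compute such a figure, not necessarily the exact method LeadPages uses.

```python
# Illustrative only: these visitor/opt-in counts are assumptions,
# not the actual data from this split test.
import numpy as np

rng = np.random.default_rng(0)

visitors_a, optins_a = 1000, 60   # Version A (assumed)
visitors_b, optins_b = 1000, 89   # Version B (assumed)

rate_a = optins_a / visitors_a
rate_b = optins_b / visitors_b

# Relative lift: how much higher B's opt-in rate is than A's.
relative_lift = (rate_b - rate_a) / rate_a
print(f"Relative lift: {relative_lift:.2%}")  # ~48.33% with these counts

# One common Bayesian approach to "probability of outperforming":
# sample each version's opt-in rate from a Beta posterior and count
# how often B's sampled rate beats A's.
samples_a = rng.beta(optins_a + 1, visitors_a - optins_a + 1, 100_000)
samples_b = rng.beta(optins_b + 1, visitors_b - optins_b + 1, 100_000)
print(f"P(B beats A): {(samples_b > samples_a).mean():.2%}")
```

Note that the probability you get depends heavily on how much traffic the test received, which is why these assumed counts reproduce the 48.32% lift almost exactly but won't necessarily reproduce the 96.07% figure.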

Although we can’t say with certainty what was behind this increase, here are a few of my speculations:

1. Although cleverly worded, the guarantee language in Version A was potentially less clear than in Version B. Customers who are motivated by a guarantee are likely to feel more comfortable purchasing when the guarantee's terms are stated plainly.

2. Version A emphasized the idea of risk, whereas Version B led with a promise to improve performance.

3. Version A’s headline was red instead of blue, which may have caused visitors to subconsciously view it as a warning message rather than an offer.

Not all customers are the same, but consider testing how you frame your guarantee the next time you offer one.


What Do You Think?

Did this test’s results surprise you? Why do you think Version B increased conversions so dramatically? Leave a comment below and let us know your thoughts.

If you’re new to LeadPages, you should know that all Pro and Advanced users can run any A/B test inside LeadPages in just five clicks.

Do you have a LeadPage like this one that you would like to test? If so, you can set up the exact same type of test in under a minute. You can also A/B test your headlines, body copy, calls to action, and just about any other change you can think of.

Watch the quick video below for an introduction to enabling split testing on your LeadPages account.

https://youtube.com/watch?v=3h3pQKLagng

  • I’ve been reading over and over again about this A/B Test, but nobody seems to have given me a clear explanation of what the heck it is. I think (but am not sure) it means putting out two alternatives of one subject and seeing which performs best. How you’d determine the “winner” is again a mystery. Above you are asking us to choose one, which is fine. But how do I know which of my own two alternatives is best? Please consider doing a post on just an explanation of this subject.

    • Daphne Sidor

      Yes, I think you have the right idea! Great questions. I’ll try to answer them as clearly as I can.

      A/B testing (also called split testing) is a way to see which of two variants of a web page (or other web asset) performs best. To determine a winner, you need to decide how you’ll measure success and find a way to track it. In this case, we’re using the opt-in rate as our success metric.

      Note that the poll in this post isn’t actually *determining* which version won the split test—it’s more of a fun way to test your predictive skills. Plus, it’s interesting to see whether how people voted matches up with the actual results. Sometimes the answer is intuitive, sometimes not. (Which is why it’s important to split test in the first place.)

      Most people will use some kind of software to run their split tests, since it’s otherwise generally difficult to accurately track the kind of data you’d need to determine a winner. LeadPages Pro and Advanced members get built-in split testing for their landing pages and LeadBoxes. You can set this up with just a few clicks, and LeadPages will automatically track your opt-in rate for both versions and declare a winner once a statistically significant difference in opt-in rate appears between the two versions. (If you are a Pro or Advanced LeadPages member, our Support team will be happy to help you set up your first A/B test.) There are also standalone split-testing services available, though many of them are a little pricey for smaller businesses.
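      If you're curious what that "statistically significant difference" looks like under the hood, here's a rough sketch of one common approach, a two-proportion z-test. The counts are made up, and this isn't necessarily the exact calculation LeadPages runs:

      ```python
      # One common significance check: a two-proportion z-test on opt-in
      # rates. Counts below are hypothetical, not from any real test.
      from math import erf, sqrt

      def two_proportion_p_value(optins_a, visitors_a, optins_b, visitors_b):
          """Two-sided p-value for the difference in opt-in rates."""
          rate_a = optins_a / visitors_a
          rate_b = optins_b / visitors_b
          # Pooled rate under the null hypothesis that A and B convert equally.
          pooled = (optins_a + optins_b) / (visitors_a + visitors_b)
          se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
          z = (rate_b - rate_a) / se
          # Standard normal CDF via erf, doubled for a two-sided test.
          return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

      # A tool might declare a winner once p drops below a threshold like 0.05.
      p = two_proportion_p_value(60, 1000, 89, 1000)
      print(f"p-value: {p:.4f} -> significant at 5%: {p < 0.05}")
      ```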

      For more on how to run a split test, you might want to check out this post: https://blog.leadpages.net/4-ways-optimize-split-testing-strategy/

      Feel free to let me know if you have any follow-up questions!

  • VeloNomad

    The download report link isn’t working. My corporate firewall typically doesn’t block these pop-ups – it just refocuses the browser window to the top.

    • Daphne Sidor

      Thanks for the heads up, Velo—I’ll get that link inside the widget fixed ASAP. In the meantime, the big green button below should be working just fine.