A/B Test: Did a Long or Short Headline Increase Opt-Ins by 38.23%?


Scrapbooking takes the best of the past and carries it forward to enrich your life in the future. It can be highly creative, but it’s ultimately based in real, lived experience.

In other words, it’s a little like A/B testing. (You knew I was going to get there eventually, right?) You assess what’s worked in the past and apply your creativity to—hopefully—get great results going forward, all while keeping track of the data so it can go into your next project.

Sometimes that creativity results in a value-rich headline that clearly communicates why your product or service is so great. However, there’s a fine line between going into compelling details . . . and just going on and on. A lengthy, content-heavy headline may be too much for the visitor to handle, and can negatively affect your opt-in rate.

In this A/B test, a very value-rich but long headline squares off against a more concise headline. Did the longer version’s 9 extra words add more weight to the offer, or sink the conversion rate?

Which version do you think increased opt-ins for this LeadPage® by 38.23%? Go down to the comments and tell us which one you’d choose and why—then vote below to see if you were right!

Vote to reveal the winning A/B-tested LeadPage® and our analysis.

Vote: Which Page Won This Split Test?
Longer, More Value-Rich Headline
Shorter, More Concise Headline
Free Download: See Results from 20 of the Best Split Tests We’ve Featured on Our Podcast, ConversionCast (Called “The Split Testing Encyclopedia of Results”). It Contains Dozens More Split Test Ideas, Results, and Insights.
Click Here to Download my Free Guide


Winner: Version B created an overall increase of 38.23%
How People Voted: 23% / 77%


If you chose Version B, you are correct!

Version B, with a 99% probability of outperforming Version A, increased opt-ins by 38.23%.
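If you’d like to check that figure yourself, the absolute numbers John shares in the comments below (1,987 total visitors, with the conversion rate rising from 16.04% to 22.17%) are enough to reproduce it. The Python sketch below assumes a roughly even traffic split between the two versions and uses a standard two-proportion z-test; it’s an illustration of how a confidence level like this can be computed, not necessarily the exact method LeadPages uses.

    from math import sqrt
    from statistics import NormalDist

    # Figures reported in the comment thread below; the per-variation split
    # isn't published, so a ~50/50 traffic allocation is assumed here.
    visitors_a, rate_a = 994, 0.1604   # Version A: longer headline
    visitors_b, rate_b = 993, 0.2217   # Version B: shorter headline

    conv_a = round(visitors_a * rate_a)   # ~159 opt-ins
    conv_b = round(visitors_b * rate_b)   # ~220 opt-ins

    # Relative lift in conversion rate
    lift = (rate_b - rate_a) / rate_a
    print(f"Relative lift: {lift:.2%}")   # ~38.2%, matching the headline figure

    # Two-proportion z-test with a pooled standard error
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se

    # One-sided probability that Version B's true rate exceeds Version A's
    prob_b_better = NormalDist().cdf(z)
    print(f"z = {z:.2f}, P(B beats A) = {prob_b_better:.2%}")   # well above 99%

With the same conversion rates but a much smaller sample, that probability would drop quickly, which is exactly the concern raised in the first comment below.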

Although we can’t say with total certainty why this change caused the increase, here are a few of my speculations:

  1. The specificity provided in Version B (9 PROVEN Tips) communicates more real-world value in fewer words.
  2. The lengthy headline in Version A overwhelmed visitors and brought down the page’s opt-in rate.

Why do you think Version B outperformed Version A? Let us know in the comments!

Not all visitors are the same, but A/B testing your headlines may be something to consider for your own LeadPages®.

Click here to get the free split-test guide

What Do You Think?

Did this test’s results surprise you? Why do you think Version B increased conversions so dramatically? Leave a comment below and let us know your thoughts.

If you’re new to LeadPages, you should know that all Pro and Advanced users can run any A/B test inside LeadPages in just five clicks.

Do you have a LeadPage® like this one that you would like to test? If so, you can set up the exact same type of test in under a minute. You can also A/B test your calls to action, text colors, images, form fields, and just about any other change you can think of.

Watch the quick video below for an introduction to enabling split testing on your LeadPages account.

https://youtube.com/watch?v=3h3pQKLagng&showinfo=0

  • Peep Laja

    Can we see the absolute numbers for this test? A 99% significance level means nothing at all if the sample size was too small. You don’t want to educate your audience to think that statistical significance is a stopping rule for a test, right? :)

    • John Nye

      Hey Peep! This A/B test actually received 1,987 visitors and increased the conversion rate from 16.04% to 22.17%. I hope this helps!

    • Daniel Hollerung

      Peep, great point. Curious if you feel a minimum number of conversions is needed to determine if success was reached? I’m of the mindset that you need volume and action to call a winner. Personally, if opt-ins were less than 100 per version, I’d have a hard time feeling confident in scaling a winner.

      • John Nye

        Good question, Daniel! 100 unique viewers per version is a good number for pages that may not receive a whole lot of traffic. I usually like to use tests that receive upwards of 400 unique viewers per test, just to be safe.

        However, if your pages are pretty popular, there’s an added benefit to running the test for a week or two, and surpassing the 400 unique viewer benchmark. This strategy allows you to account for any viewer bias based on the day of the week and can provide you with a more accurate percentage lift in your conversion rates!

        • Daniel Hollerung

          That makes sense John. Just to be sure we’re on the same page, are you saying that a minimum of 400 visitors clicked the CTA from each test variant and opted in for the newsletter? That’s what I’m trying to get at.

          I know that we need a number of visitors (viewers) exposed to each version of the test, but I also want to see engagement (conversions) to ensure the copy change had a direct impact on the results. Is that what you’re saying when you strive to get 400 viewers to take action over the one- to two-week time frame?

          • John Nye

            I would say that 400 total unique viewers is generally enough traffic for most businesses. In terms of the number of conversions, 400 CTA clicks would be preferred (since, in this case, it would be a 100% conversion rate :P). However, for businesses that have a very low conversion rate, getting to 400 conversions may take months. In this situation, it may make more sense to stick with a 400 unique viewer benchmark, because the level of impact should still be pretty clear within your data: it is all based on an average, and any significant change should be easily detected.

            But, let’s say that your site can get 400 unique viewers in 3 days. If this is true for you, running your test for at least a week or two would be more beneficial as traffic is not a factor, but bias based on the day of the week is definitely in play.

            Ultimately, if you want to determine a change’s direct impact on your conversions, it is important to test individual variables at a time so that your takeaways are more clear. So, when you want to test your headline, I would avoid changing any other element of the page. As a result, you will be able to determine the change’s direct impact on your conversion rate with ease.

            Does that make sense?

          • Daniel Hollerung

            I appreciate your take on this John, thanks for the reply.

      • Peep Laja

        This is science, not magic. There is no magic number like 100. You have to calculate the needed sample size in advance using a tool like https://www.optimizely.com/resources/sample-size-calculator/ or similar.

        In most cases a test that ran for less than 2 weeks (2 business cycles) and had less than 350 conversions PER variation is an uncooked test, and the final outcome is most likely random. But 350 is not a magic number either.

        Full explanation here: http://conversionxl.com/stopping-ab-tests-how-many-conversions-do-i-need/

        • Daniel Hollerung

          Agreed, thanks Peep. I calculate sample size with expected lift for my tests but am still cautious when a win is called with such a low number of conversions.

          I’ll read your full explanation for more insight. Thanks again.
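To put some numbers behind the sample-size discussion above: calculators like the one Peep links to are generally built on the standard power calculation for comparing two proportions. Here’s a minimal Python sketch of that calculation; the 16% baseline rate (close to Version A’s here) and the 20% minimum detectable lift are illustrative assumptions, not figures from this test.

    from math import ceil, sqrt
    from statistics import NormalDist

    def visitors_per_variation(baseline_rate, min_detectable_lift,
                               alpha=0.05, power=0.8):
        # baseline_rate: current conversion rate (e.g. 0.16 for 16%)
        # min_detectable_lift: smallest relative lift worth detecting (e.g. 0.20)
        # alpha: two-sided significance level; power: chance of detecting that lift
        p1 = baseline_rate
        p2 = baseline_rate * (1 + min_detectable_lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                     + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
        return ceil(numerator / (p2 - p1) ** 2)

    # Example: 16% baseline, aiming to reliably detect a 20% relative lift
    print(visitors_per_variation(0.16, 0.20))   # roughly 2,200 visitors per variation

Notice how quickly the required sample grows as the lift you want to detect shrinks, which is Peep’s point about small tests producing random-looking winners.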