
Got a Blog? Use These 5 Proven Split Test Winners to Boost Conversions (Case Study)

  • Conversion Rate Optimization

This post is twenty-one months in the making. Back in January 2017, I wanted to write about A/B split-testing ideas.

But after a few Google searches, I found that many of the before-and-after images featured in the top posts were underwhelming, overused, or, in many cases, both.

So, the marketing team and I began running our own A/B split tests, mainly on our blog, to get data we could use for a future post.

That future post is the one you’re reading now. 

In this post, I’ll share my five favorite A/B split tests from twenty-one months of continuous testing. I’ll reveal our key learnings—including a few surprising findings—and give you concrete takeaways you can use to inform your future campaigns.

Free Downloadable Bonus

Want More Conversion Rate Optimization Strategies?

Get access to our free CRO toolkit and skyrocket your organic traffic, on-page conversion rate and more (includes resources not found in the blog post).

Experiment 1. Radio Buttons Vs. Drop-Down

Like many companies that rely on email marketing to engage potential buyers, we’re eager to segment new subscribers, particularly on our blog. 

To enrich our lead data, we need to know as much as possible about new subscribers—their company size, annual revenue, number of employees, and more. The more data we have, the easier it is to score leads and prioritize outreach accordingly.

What’s crucial in the beginning, though, is the subscriber’s industry. With many of our customers working in e-commerce, we know that a subscriber who belongs to that industry is more likely to become a customer than, say, a beginner to marketing (which is one of our segmentation options).

A preview of how we’re scoring leads today.

When we first began segmenting subscribers, we invited them to click a link in our welcome email, as many marketers do. But with few subscribers segmenting themselves, we ended up with a hodgepodge of segmented and unsegmented subscribers.

To combat that, and to get more subscribers segmenting themselves, we asked visitors to segment themselves before they opted in, via an email popup. We feared that adding another input field might hurt our conversion rate, but we also knew that it would improve our lead quality. 

We eventually added a segmentation option to our popup and noticed an improvement in the number of visitors segmenting themselves…

But we knew we could do better. 

One hypothesis we wanted to test was whether the method visitors used to segment themselves affected conversions. Specifically, was having to choose an option from a drop-down causing friction and affecting conversions? 

To test it, we pitted our blog popup with a drop-down (our control) against one with radio buttons.

After running the test for two weeks, the winner was clear:

Website visitors were 41.18 percent more likely to segment themselves when opting in through a popup with radio buttons than one with a drop-down option.

Our hypothesis was correct. And, like many of the results you’ll read below, it informed how we’re collecting leads on our blog today.
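For readers who like to check the math, lift figures like the 41.18 percent above are relative, not absolute: they measure the improvement as a fraction of the control’s rate. Here’s a quick sketch (the conversion rates are hypothetical, chosen only to reproduce the same percentage):

```typescript
// Relative lift: the variant's improvement expressed as a
// percentage of the control's conversion rate.
function relativeLift(controlRate: number, variantRate: number): number {
  return ((variantRate - controlRate) / controlRate) * 100;
}

// Hypothetical rates: control converts at 3.4%, variant at 4.8%.
const lift = relativeLift(0.034, 0.048);
console.log(lift.toFixed(2)); // prints "41.18"
```

In other words, a 41 percent lift doesn’t mean 41 percent of visitors converted; it means the variant’s rate was roughly 1.41 times the control’s.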

Key Learning

If you’re not getting results from segmenting new subscribers with email, invite website visitors to segment themselves through a website popup when opting in, instead. In our experience, it’s your best chance of collecting enriched lead data.

Experiment 2. 30-Second Time Delay Vs. 30% Scroll Trigger

One rewarding aspect of testing your advertising is that data trumps opinion.

You might feel like changing a headline, a call-to-action (CTA), or even a single word will make a difference to a campaign’s performance. You might even have a good reason to believe so. But if you test it and it bombs, you have your answer.

Popups, and in particular, triggers, are no exception.

When creating a new campaign, knowing where to place it is easy. If you’re driving traffic to, say, a product page, it makes perfect sense to test a campaign there. 

But deciding when to show it is challenging without data to use as a starting point.  

In our case, we knew we wanted to show a campaign on our blog as it gets the majority of our traffic. But we didn’t know when to show it. In the beginning, we went into Google Analytics, looked at our average on-page time, and set a campaign to show before the average user exited our site. It wasn’t ideal, but it was enough to get started. 

Over time, though, as we improved as marketers, and got better at making data-driven decisions, we wanted to base when to show our popup on real-world testing.

Our next experiment, then, involved testing a slide-in with a 30-second time delay (control) against one that triggered when a visitor scrolled 30 percent of the way through a blog post. (Note: we chose 30 percent after running a heatmap on our blog and seeing that 30 percent was the average drop-off point.)
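If you’re wiring up a scroll trigger yourself rather than using a tool, the core logic is small: compute how far down the scrollable area the visitor is, and fire once that fraction passes your threshold. A minimal sketch (the function names are illustrative, not Sleeknote’s implementation):

```typescript
// Should a scroll-triggered campaign fire yet?
// scrollY: pixels scrolled; viewport: visible height; pageHeight: full document height.
function shouldTrigger(
  scrollY: number,
  viewport: number,
  pageHeight: number,
  threshold = 0.3 // fire at 30 percent scroll depth
): boolean {
  const scrollable = pageHeight - viewport;
  if (scrollable <= 0) return true; // page fits in one screen
  return scrollY / scrollable >= threshold;
}

// In the browser, you would check this on each scroll event, e.g.:
// window.addEventListener("scroll", () => {
//   if (shouldTrigger(window.scrollY, window.innerHeight, document.body.scrollHeight)) {
//     showCampaign(); // hypothetical display function
//   }
// });
```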

After another two weeks of testing, we had another piece of the puzzle:

The slide-in with a scroll trigger outperformed the one with a time delay by 61.83 percent.

Bottom line?

When it comes to choosing a trigger, test two against each other to determine which one performs better. Data trumps opinion, remember. Always.
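One caveat when comparing two triggers (or any two variants): make sure the gap you see is bigger than random noise. A textbook two-proportion z-test is a reasonable sanity check; the visitor and conversion counts below are hypothetical, not our data:

```typescript
// Two-proportion z-test: is the difference between two conversion
// rates larger than chance alone would explain?
// |z| > 1.96 roughly corresponds to significance at the 95 percent level.
function zScore(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// Hypothetical: 100 of 2,000 control visitors converted vs. 150 of 2,000.
const z = zScore(100, 2000, 150, 2000);
console.log(z > 1.96); // prints "true": the difference is unlikely to be noise
```

The larger your sample, the smaller the difference you can reliably detect—which is one reason we let tests like these run for two full weeks before calling a winner.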

Key Learning

If you’re running popup campaigns on your website, consider the trigger you’re using. If your current choice is based on anything other than findings from an experiment, it might be time to run a split test to determine the optimal trigger.

Editor’s Note

If you don’t have time to run A/B split tests, check out Smart Triggers from Sleeknote. Our signature technology continuously tests—and optimizes—the best trigger for your campaigns over time. Learn more about it here.

Sam Thomas Davies
Head of Content

Experiment 3. Desktop Teaser Vs. No Desktop Teaser

One of the ways we distinguish ourselves from countless copycat competitors is our teaser feature.

A teaser, if you’re unfamiliar, is a preview of a popup’s content, often found in the bottom left- or right-hand part of the user’s screen.

Its job, when done right, is to make visitors curious enough to click through and opt in through the popup.

(Yes, that’s an emoji in the teaser. And, yes, we tested that too.)

After our previous test, we knew a 30 percent scroll trigger was better than a 30-second time delay. But we wanted to see if adding a teaser drove even more conversions.

To our surprise, it did:

Adding a teaser to our blog slide-in increased email sign-ups by 81.83 percent.

While it’s hard to determine the reason it performed better, our belief is twofold:

  • It created an information gap. Classic research by Russell Golman and George Loewenstein suggests that when we feel a gap between what we know and what we want to know, our curiosity drives our need to seek out new knowledge. When you read, “We’ve Got Something for You…,” you can’t help but click through to learn more.
  • It uses a two-step opt-in. Playing on the foot-in-the-door technique and the commitment and consistency principle, visitors are more likely to opt in if they click a teaser. Why? Because they’ve already taken action. If anything, NOT opting in after doing so requires MORE effort.

Increasing conversions from visitor to subscriber doesn’t have to take long. As you can see, it’s sometimes as quick as enabling a teaser.

Key Learning

If you’re using a slide-in campaign on your website, consider adding a teaser to increase conversions.

Experiment 4. Minimalist Design Vs. Branded Design

“Place your bets, now!”

That’s how it often starts.

Our Head of Growth, Kristian, will post two campaign variations in our marketing Slack channel before inviting the team to predict which will win in a split test.

For one test, in particular, the voting was almost unanimous—nearly everyone chose the custom-designed, branded popup. And it makes sense: you would assume that a campaign that’s easier on the eyes would perform better. But as you know by now, that’s not always the case.

“Will a custom-designed, branded popup outperform its basic, minimalistic counterpart?” That was the question we wanted to answer in our next test.

Before revealing the results, take a moment to consider which you’d be more likely to engage with.

Our (seemingly unbeatable) control, built using our drag-and-drop editor?

Or, our experimental challenger, made by our in-house designer, Damien?

If you chose the former, you’re not alone:

Our minimalist popup design increased email sign-ups by 137.25 percent.

Findings like the above might surprise design-savvy marketers (and relieve the inexperienced). But if you’re designing a campaign, remember: despite your best intentions, a campaign that’s easy on the eyes won’t always lighten a buyer’s wallet.

Key Learning

Contrary to popular belief, you don’t need a designer to create a high-converting website popup. Oftentimes, a basic campaign built with a drag-and-drop editor is more than enough to drive high conversions.

Experiment 5. Description of Offer With Bullets Vs. Image of Offer

It’s a common marketing practice to include an image in a popup, especially if you’re offering a freebie. And it makes sense: if, as a website visitor, you’re asked to opt in for a lead magnet, you want to know what you’re getting, right?  

We thought so, too. But after several previous tests, we were unsure why our control was performing so well despite not showing what our visitors would receive in exchange for their email. After some consideration, we realized the bullets describing the offer played a bigger role than we initially thought.

To find out for certain, we ran an experiment testing it against a similar design that, instead, previewed the offer with an image:

After 49 days of testing, the results were clear:

A popup describing the offer with bullet points boosted our conversion rate by 167.21 percent. 

Further, the winning campaign accounted for 72.8 percent of ALL our blog leads within that period.

“A picture’s worth a thousand words,” says the old English adage. But in our experience, good, compelling bullets are worth far more. 

Key Learning

Avoid writing off good copywriting in a popup campaign. It’s often THE differentiating factor in a winning campaign. To learn how to write better copy, read my articles on persuasive writing techniques and good copywriting examples.


Conclusion

I have a confession to make:

If you joined our newsletter while reading this post, you’re part of a new A/B split test we’re running.

We’re currently testing whether a branded design with bullets will beat our control…

But you’ll have to return to this post when we have the results in a few weeks. 😉

Now I want to hear from you:

Which A/B split testing idea will you try first? Leave a comment below.
