A/B testing is often used to boost conversions. In fact, targeting and testing methods can increase conversion rates by up to 300 percent. But A/B testing can do even more: it can have a significant effect on your retention rates. Testing for what users really want vs. what you think they want is crucial, and nothing beats the thrill of meeting or exceeding their expectations.
There are many things to test that can improve retention rates, but here are five important areas every marketer should explore.
#1 A/B testing helps find the information that matters the most to users
When should you ask for email addresses or reviews, or gently nudge customers toward a purchase? When should you ask users to watch an ad to earn free credits in your app? Through A/B testing, you’ll understand what your customers want and when they want it, and deliver increasingly effective communications. The team at German software-training brand MedienReich thought replacing ‘course categories’ with ‘bestselling courses’ on their homepage would boost engagement, so they ran an A/B test for 20 days.
During the 20-day period, the variation improved engagement by a whopping 40.87% at a confidence level of 99.9%. According to the test results from VWO, the variation won because it reduced the effort visitors had to put into finding MedienReich’s bestselling training products. And while the team could have simply shipped the change on instinct, the A/B test gave the company further insight into their customers’ wants and needs.
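If you want to sanity-check a result like this yourself, the standard approach is a two-proportion z-test, which is the kind of calculation tools like VWO run under the hood. Here is a minimal sketch in Python; the visitor and conversion counts are made up for illustration, not MedienReich’s actual data:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical counts: control vs. the 'bestselling courses' variation
p_a, p_b, z, p = two_proportion_z_test(conv_a=480, n_a=6000, conv_b=676, n_b=6000)
print(f"control {p_a:.1%}, variation {p_b:.1%}, z = {z:.2f}, p = {p:.2g}")
print(f"lift: {(p_b - p_a) / p_a:+.1%}")  # roughly +40.8% with these made-up numbers
```

A p-value below 0.001, as here, corresponds to the 99.9% confidence level the test reported.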
#2 Discover which elements on your forms are most relevant to users
There are different types of opt-in boxes you can make available on a website, and you can always guess your way into a meaningful touchpoint with customers, but testing gives you a more accurate sense of direction. Testing for what users want to see on your forms (vs. what you think they want) can greatly improve your retention and conversion rates. For example, the team at Bionic Gloves came up with the idea of removing the discount-code and special-offer boxes on their checkout page; their gut feeling was that doing so would increase conversions.
Before taking action, they launched an A/B test: one page variation with those incentives and one without.
Their results? The “incentives” distracted users and sent them elsewhere. Once the boxes were removed, total revenue increased by 24.7% and revenue per visitor by 17.1%.
You can release versions of your landing pages with different elements during different periods and see what happens. Or, show one landing page variation to one set of customers and another variation to a different segmented group. Yet another angle: monitor whether engagement is higher when a given element, such as a dashboard, is present or absent. Compare your results to see which elements retain users and which convert them, and build a data-backed strategy.
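One practical detail when splitting customers into segmented groups: each visitor should see the same variation on every visit, or your numbers get muddied. A common way to guarantee that is deterministic bucketing, hashing a stable user ID into a variant. A minimal sketch, where the function name, experiment label and 50/50 split are all illustrative:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user: the same ID always gets the same variant."""
    key = f"{experiment}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 10_000
    return "A" if bucket < split * 10_000 else "B"

# Every visit by the same user lands in the same variant for this experiment
print(assign_variant("user-42", "checkout-incentives"))  # stable across sessions
```

Hashing on experiment name plus user ID also means different experiments split the same audience independently.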
#3 A/B testing increases conversions
What landing page design is best for your campaign? What copy do target customers want to see on your ad? How long should your landing page be? All these questions can be answered via guesswork or intuition. Or, you can run surveys and analyze the info. While many believe surveys can be as effective as A/B tests, experience shows that people answer one way on surveys and act another way in real life. As American cultural anthropologist Margaret Mead put it: “What people say, what people do, and what people say they do, are entirely different things.”
Your best avenue for learning the true intentions of customers is to set up A and B variations of everything from your website copy and email campaigns to your search ads and apps, and take note of the responses you receive. Conversion optimization experts at ConversionVoodoo, for instance, tested three variations above the fold of the Kiva.org landing page to see how the site might improve conversions:
Their three “above the fold” variations were:
- Additional explanation of how Kiva.org solves big problems
- A 90-second video explaining how Kiva works
- A large slider to appeal to visually oriented visitors who respond to graphics
They tested the three variations and settled on a winner that increased the conversion rate by 11.5% at over 95% confidence.
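Before launching a test like this, it also pays to estimate how many visitors each variation needs, or the test may end before it can reach 95% confidence. A rough sample-size sketch using the standard normal approximation; the baseline rate and target lift below are assumptions for illustration, not Kiva’s actual figures:

```python
from math import ceil

def visitors_per_variant(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Approximate sample size per arm for a two-proportion test
    (defaults: two-sided 95% confidence, 80% power)."""
    p_new = p_base * (1 + lift)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_new - p_base) ** 2)

# Assumed 5% baseline conversion, aiming to detect a +11.5% relative lift
print(visitors_per_variant(p_base=0.05, lift=0.115))  # roughly 24,000 per variant
```

The takeaway: small relative lifts on low baseline rates need surprisingly large audiences per variation.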
#4 Discover which trial periods work best via A/B testing
Your definition of “long enough” for a product trial period may be different from your users’. Some companies run 3-day trials, some 7, others 14, 30, 60 and so on. It can get confusing if you’re not very experienced with product trials. A/B testing, pitting one trial length against another, is your best option.
In 2010, HubSpot’s homepage offered users a 7-day trial, but the team wanted to test whether a 30-day trial would increase subscriptions. HubSpot product manager Magdalena Georgieva said: “We were curious to see if offering a longer trial period would entice more visitors to sign up. Would it have a significant effect?”
Their results? The 30-day variation drove a 110% increase in HubSpot free-trial signups at a 99.9% confidence level.
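To run this kind of test yourself, the assignment step is the same bucketing idea shown earlier, applied to trial length at signup; record the assigned length with the account so trial-to-paid conversion can be compared per cohort later. A hypothetical sketch:

```python
import hashlib

TRIAL_DAYS = {"A": 7, "B": 30}  # the two trial lengths under test

def assign_trial_days(user_id: str) -> int:
    """Split new signups 50/50 between a 7-day and a 30-day trial."""
    digest = hashlib.sha256(f"trial-length:{user_id}".encode()).hexdigest()
    return TRIAL_DAYS["A" if int(digest, 16) % 2 == 0 else "B"]

# Store the result on the user record at signup, then compare
# trial-to-paid conversion per cohort once the trials have ended.
print(assign_trial_days("user-42"))
```

Note that the 30-day cohort needs a longer wait before you can read results, since its trials end three weeks later.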
#5 What notifications do users open? A/B testing can get you answers
Some notifications anger users and cause them to churn; in fact, over 50% of app users find push notifications annoying. But there are notifications your users would love, and it’s the marketer’s job to discover what those are. Candy Crush, for example, sends notifications its users specifically asked for. Run your own tests: what do your analytics say about the relationship between your notifications and user engagement and retention?
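If you are instrumenting this, the comparison itself is simple: count opens and ignores per notification variant and test whether the open rates differ. A sketch using SciPy’s chi-square test on a 2x2 table; the variant names and counts are invented for illustration:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts per notification variant: [opened, ignored]
observed = [
    [320, 4680],  # variant A: feature announcement
    [410, 4590],  # variant B: friend-activity nudge
]
chi2, p_value, dof, _ = chi2_contingency(observed)
print(f"open rates: A {320/5000:.1%}, B {410/5000:.1%}, p = {p_value:.4f}")
```

A small p-value says the difference in open rates is unlikely to be chance, which is the signal to roll the winning copy out more widely.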
The value of testing confirmed
A/B testing is a valuable tool not only for conversions and revenue but also for retention. Test different versions to see which products or changes your customers want and which interest them the least. And since the customer is always right, make their wishes come true by proving that you know them, you hear them, and you’re catering to their needs.