Mobile App A/B Testing: Everything You Need to Know

Jul 20, 2019

Data is now used to influence just about every aspect of our lives, so it’s a no-brainer to use it when figuring out which version of an app’s UX or marketing creative works best. This latest chapter in our product spotlight series focuses on A/B testing: what it is, why it should be done, how to get the best out of it, and finally some use cases that have brought success to global brands.

What is A/B Testing?

A/B testing compares two (or more) variants of a mobile app’s UX or content. Audiences are split into random groups, and each group receives a different variant. The most successful variant is then determined by analyzing response data according to agreed criteria.
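Under the hood, the “random split” is usually deterministic: a stable user identifier is hashed into a bucket, so each user keeps seeing the same variant for the life of the test. Here’s a minimal sketch in Python; it’s illustrative only, not MessageGears’ actual implementation:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into a test variant.

    Hashing the user ID together with the experiment name gives a
    stable, roughly uniform split: the same user always lands in the
    same group, across sessions and devices.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: splitting users across two navigation variants
print(assign_variant("user-12345", "nav-structure-test"))  # "A" or "B"
```

Hashing on the user ID rather than rolling the dice per session keeps the experience consistent for each user and keeps the groups stable for analysis.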

Why A/B Test?

It’s simple: let the data decide the right approach. The alternative is making decisions based on ‘gut instinct’, or on the opinion of the most senior or most persuasive person in the room. Making decisions on cold, hard data removes that ambiguity and puts you in a far stronger position to deliver better customer experiences and marketing.

The Different Types of A/B Testing

There are two main types of A/B testing. Both are easy to set up, deploy, monitor, and analyze in the MessageGears dashboard:

UX A/B Testing: Test specific aspects of your app’s UX to find out how users respond to variants of the in-app experience. UX A/B testing not only helps your users receive the best possible experience in your app, but also helps you optimize the app’s performance against your KPIs. That could be anything from navigation structure to the construction of your purchase flow, from the difficulty of a particular level in a game to the price of an in-app purchase.

Content A/B Testing: The other main type is content A/B testing, which allows brands to tailor their marketing messaging to really resonate with their users. This can be anything from experimenting with images in an in-app message or rich push notification, to testing different CTAs to see which word combination delivers the most conversions, to measuring the impact of adding emojis.

What to Test?

It can be all too tempting to test absolutely everything. We think it’s better to be strategic and narrow tests down to what really matters. Otherwise it’s just a fishing expedition that will deliver inconclusive results. Decide what metrics you want to measure or influence before you start planning the content of the test itself. Engagement, for example, can mean a lot of things: time in-app, more frequent sessions, interaction with content, social sharing, and so on. Think about what will constitute success for your test and set goals for uplift.
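Once you’ve picked a metric, the uplift itself is just the relative change between control and variant. A quick sketch, with hypothetical numbers:

```python
def uplift(control: float, variant: float) -> float:
    """Relative uplift of the variant over the control, as a percentage."""
    return (variant - control) / control * 100

# Hypothetical example: conversion rate rises from 4.0% to 4.6%
print(f"{uplift(0.040, 0.046):.1f}% uplift")  # 15.0% uplift
```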

Examples of Successful A/B Testing

With the basics behind us, let’s take a look at ways that A/B testing has been deployed successfully in mobile apps:

Use Case 1: Testing Navigation UX to Increase Engagement

Which menu system to use causes more arguments among teams than we’d like to think about, and for good reason: it’s an incredibly important thing to get right. Navigation controls which content is displayed and how users move and browse through the experience. If it’s not what users want, your app will suffer.

A/B testing can step in to answer questions like: is a hamburger navigation or a traditional menu the better UX? In what order should items appear in the menu? Where should specific features live? All of these have a significant impact on the performance of the app, and A/B testing alternative approaches helps you optimize and iterate your app rapidly for success.

One MessageGears customer, a major entertainment brand, wanted to optimize their navigation structure. They were considering replacing their traditional tabbed nav with a hamburger nav across all devices. It felt like the right thing to do, and a lot of expert advice confirmed it was time to change. They ran an A/B test before committing entirely to the project and discovered that engagement rates fell 50% with the new hamburger nav. A lucky escape, thanks to A/B testing.

Use Case 2: A/B Testing to Maximize Revenue

One MessageGears customer, a world-renowned publisher, was struggling to generate the revenue they wanted from their app. Instead of doing something drastic, like cutting the price or changing strategy entirely, they decided to A/B test their paywall (the number of articles a user can read before having to subscribe) to see how changing that limit affected the number of mobile subscriptions.

The test was simple, but well defined and carefully thought out. The normal paywall allowed readers 8 free articles before they had to subscribe to continue reading. For the A/B test, the audience was split 50/50, with half receiving an alternative paywall that limited them to just 4 free articles. Each variant was delivered to over 190,000 participants. It was a brave move that paid off handsomely.

The results were surprising: readers were 27% more likely to convert with the 4-article paywall. Further analysis showed that, on average, users on the 4-article limit brought in 20% more revenue over the month the campaign ran.
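If you want to sanity-check results like these, a standard two-proportion z-test tells you whether a difference in conversion rates could plausibly be noise. Here’s a sketch; the conversion counts are hypothetical, chosen only to mirror the roughly 27% lift, since the underlying rates aren’t reported here:

```python
from math import sqrt, erf

def two_proportion_p(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical counts: 8-article arm converts 2.0%, 4-article arm 2.54% (~27% lift)
print(two_proportion_p(conv_a=3800, n_a=190_000, conv_b=4826, n_b=190_000))
```

With roughly 190,000 users in each arm, a lift of that size produces a vanishingly small p-value; with smaller audiences, the same check guards against acting on chance fluctuations.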

And Finally…Iterate

Testing is not a one-and-done process: you plan, you run the tests, you analyze the results, and then you do it again. Retest any assumptions you derive from the results, and keep refining based on what you see. This is a process that demands long-term commitment.

To learn more about how MessageGears helps global brands deliver A/B testing across mobile, web, email, OTT, or SMS, please schedule a demo here.

About the Author

Will Devlin

A 20-year email marketing veteran, Will has focused on marketing strategy and execution for MessageGears since 2014. He has extensive experience on both the retail customer and service side of email marketing, and he’s interested in helping businesses better understand how they can make the most of the work they put into their email campaigns.