How to Run App A/B Tests and Get Results

If you have a successful app, you’ve probably already thought about redesigning it. You may have defined goals for a redesign, such as improving the UX or moving to modern technology. But how can you know that people will actually like your changes and that they’ll be beneficial for your app? The answer is A/B testing.

What is A/B testing?

A/B testing (or split testing) lets you compare two or more variations of the same feature or interface element to discover which one users prefer. Split testing also helps you build your marketing strategy, as you learn whether a new design or feature is worth implementing.

[Image: A/B testing scheme]

Example 1. You’re planning to change your order button
Analytics reports and real-time statistics tell you that users often abandon shopping carts in your app. You think users miss your order button because it’s too small. To test your hypothesis, you set up an A/B test with two shopping cart variations: one with a bigger button and one with the existing, smaller button. Next, you define a segment of 12,000 users and show 50% the current button and 50% the new button. You find that 34% of users tap the bigger button, compared to only 5% who tap the smaller button. So you make the button bigger for all users and get more conversions.
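With numbers like those in the example, you can sanity-check whether the difference between variants is statistically meaningful using a standard two-proportion z-test. Here's a minimal Python sketch using only the standard library; the 6,000-user-per-variant split and tap counts are taken from the example above:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the z-score and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 6,000 users per variant; 34% tapped the big button, 5% the small one
z, p = two_proportion_z(2040, 6000, 300, 6000)
print(f"z = {z:.1f}, p = {p:.3g}")  # a tiny p-value means the lift is real
```

A difference this large is unambiguous, but for smaller lifts (a 2–3% change in conversion) a test like this is what tells you whether you're seeing a real effect or noise.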

Example 2. Your designers want to change the colors of your app’s UI
While everyone else is adopting minimalism, your app’s UI is screaming with bright orange, green, and yellow. Your designers suggest switching to black and white with mellow blue accents. You believe it’s not a good idea, so you split test their alternative with the current color scheme using a segment of 2,500 users. The results prove you wrong, so you implement the more subdued colors for all users and get positive feedback. Busted! Now your designers are pushing you to change the overall UI!

Example 3. You’re rebranding your app
Users keep saying your app is good but it looks like it’s stuck in 2008. And you know it’s time for a big redesign for your whole brand. The logo, colors, icons, and all UI elements have to be changed. You choose to move gradually, so you first split test new logo alternatives and screenshots with an updated UI on your App Store page. Then you start surveying users and find out the logo is good, but you still need to work on the interface and deliver app screens that are brighter and less crowded.

Example 4. You want to implement new technology in your app
You’re planning to implement an AI-based recommendation algorithm for your video streaming app to help users discover new content. So you run an A/B test with two recommendation engines: the old engine shows new content only from channels users are subscribed to, while the new one contains AI recommendations. The results show that users tap more on the AI-recommended videos, so you make the feature available for everyone.

Example 5. You’re changing the tone of voice in your app
You’ve decided to add a design persona – a plump hamster in a sports outfit – to deliver a new tone of voice in your nutrition app. But you’re a bit afraid that users won’t accept him, so you run a split test: 1,500 users get nutrition advice from the hamster and another 1,500 users see pop-ups with no character (as before). The hamster gets 67% more users to open the full advice description, so you decide that the plump little guy is doing a great job for you.

What are the benefits of web and mobile app A/B testing?

The goal of A/B testing is to get rid of uncertainty when making important decisions about your app. Marketers and business owners often use A/B tests in their mobile apps and web software to try out ideas they have come up with on their own or to test ideas based on data from reports. With A/B testing, you can check if your idea is viable and see how users react. Here are some more benefits of split testing:

  • Handle risks and save money as you test updates before implementing them permanently and for everyone

  • Improve your main business KPIs (we’ll tell you about what metrics you can use later in the article)

  • Get insights to improve your app’s user interface, user experience, and content

  • Discover actual user behavior and responses to new designs and features

How do you conduct A/B testing for mobile apps and web software to get meaningful results?

You can A/B test either in your app or on your app store page (learn more about how to optimize your app store page from our guide). In this post, we’ll focus on A/B testing in-app content.

Analyze your app data before split testing

Google, Facebook, and Netflix all run A/B tests regularly – and only with enough data to get reliable results. You need metrics to help you measure the success of your A/B tests:

  • Conversion rate — The percentage of users that complete a goal (make a purchase, subscribe, etc.) in your app

  • Retention rate — The percentage of customers that keep purchasing goods or paying for a subscription in your app; typically calculated as an average over a period of six months or a year

  • Churn rate — The percentage of users who stop using your services or buying within your app over a given period 

  • Active users — The number of unique users who use your app daily (daily active users) or monthly (monthly active users)

  • Daily sessions — The number of sessions you get from daily users in a single day; shows you if users open your app as often as you want them to 
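As a rough illustration, the first three metrics above come from simple counts. A minimal Python sketch (the example numbers and function names are made up for illustration):

```python
def conversion_rate(converted_users, total_users):
    """Share of users who completed the goal (purchase, subscription, etc.)."""
    return converted_users / total_users

def retention_rate(active_at_end, new_users, active_at_start):
    """Share of starting users still active at the end of the period,
    excluding users acquired during that period."""
    return (active_at_end - new_users) / active_at_start

def churn_rate(users_lost, users_at_start):
    """Share of users who stopped using the app over the period."""
    return users_lost / users_at_start

# Example month: 10,000 users at the start, 1,500 new sign-ups,
# 9,200 active at the end, 2,300 of whom made a purchase
print(conversion_rate(2300, 9200))             # 0.25
print(retention_rate(9200, 1500, 10000))       # 0.77
print(churn_rate(10000 + 1500 - 9200, 10000))  # 0.23
```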

Checking the average metrics for your industry can give you a good starting point. But in the longer run, it’s better to develop your own benchmarks for success depending on your business specifics. This way, you’ll be able to generate more relevant ideas on how to improve your app and will get more trustworthy results. Check out our guide to app analytics if you want more insights into how to improve your app’s performance and ROI.

Follow the A/B testing flow

While collecting valuable data for mobile and web app A/B testing, you may discover you have to redesign and update multiple parts of your app – and that's fine. You'll probably be tempted to start testing them all, but we recommend you don't. If you A/B test multiple elements simultaneously, you risk not knowing which of them drove the results. That's why it's better to split test gradually, accurately measuring the influence of every single update. Now let's look at how to create an A/B test for a mobile or web application.
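When you do run a test, a common way to split a segment between variants – and keep each user in the same variant across sessions – is deterministic hashing of the user ID. A minimal sketch, assuming string user IDs and an experiment name of your choosing:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically map a user to a variant: the same user always
    gets the same variant for a given experiment, with no state to store."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

print(assign_variant("user-42", "big-order-button"))  # stable across calls
```

Including the experiment name in the hash means the same user can land in different buckets for different experiments, which keeps one test from biasing another.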

Build a priority list
Prioritize elements to test depending on the potential benefit you can get from them and based on the significance of the problem they solve. For example, if your onboarding screen is confusing and you also need to change the button colors, you should probably start with the onboarding, as it may cause users difficulties at the beginning of their journey. Uber has even developed an experimentation platform to identify opportunities for improving its app and to run A/B experiments on them.

State your hypothesis
After you know what to split test, you need an idea of which options to test to get the desired results. This idea is your hypothesis. You can test virtually any change that might improve your metrics or create a better user experience. But you need a measurable goal for these changes – say, boosting your retention rate to 35%.

Take Netflix as an example: They hypothesized that they needed to recommend content to watch within 90 seconds or else users would get bored and leave. For A/B testing, Netflix used big data and an AI algorithm to find what people with similar content preferences watched and provide relevant recommendations to users. 

Choose your segment and duration
To ensure your split test results are accurate, you'll need to calculate the number of users to include and how long to run the test. Don't worry; you don't have to do the math yourself, as there are plenty of free sample size calculators on the internet. In Unbounce's calculator, for example, you only need to enter your current and desired conversion rates, the number of variations, and your number of daily users. You'll then see a recommended sample size and test duration.

[Image: Sample size calculator]
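Under the hood, calculators like this typically use the standard sample-size formula for comparing two proportions. A minimal Python sketch, assuming the usual defaults of 95% confidence and 80% power (the example conversion rates are illustrative):

```python
from math import sqrt, ceil

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Users needed per variant to detect a change in conversion rate
    from p1 to p2 at ~95% confidence and ~80% power
    (standard two-proportion sample-size formula)."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# e.g. current conversion rate 15%, hoping to lift it to 20%
print(sample_size_per_variant(0.15, 0.20))
```

Note how the required sample size explodes as the expected lift shrinks: detecting a 1% lift takes far more users than detecting a 5% lift, which is why small apps should test bold changes rather than subtle tweaks.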

Create variations
You'll need content to test your hypothesis, and you can create more than one variation for a split test. Google once tested 41 shades of blue to find the best fit for Gmail, so don't hesitate to make as many variations as you need. But don't get carried away, as more variations require more users. See for yourself: to detect a 20% conversion rate, you'll need 7,200 users (as the sample size calculator states), but with four variations you'll need twice as many people. And if you don't have that many users, you can't be fully confident in your test results.

Test variations at the same time
If you test variations of the same part of your app at different times, you risk skewed results. For instance, your online shopping app may get many more purchases in December because of Christmas. And if you test, say, one variation of your product layout in November and the second in December, you're likely to misinterpret the impact of your A/B test on sales. The exception is when you're split testing timing itself – say, finding the best time to send push notifications to your users, as Netflix did.

Ask for user feedback
We recommend backing your analytics data with user feedback. Otherwise, along with better metrics, you may get plenty of angry users, like Snapchat did after its update in 2018. Instead of collecting feedback from users that tried the new app design (and probably hated it), Snapchat released the new app version for all users. In a short time, over 600,000 people had signed a petition for Snapchat to revert to the older version, and millions of users simply left the app and never went back.

Report A/B test results and repeat
Once you finish the A/B test, report the results to your team and users so they know why you’re updating your app. These reports can inspire new hypotheses for future tests, so you can conduct a test retrospective and organize regular meetings to generate improvements for your app. If possible, build a testing roadmap with detailed descriptions and then repeat the A/B test flow all over again.

[Image: A/B test flow]

Preparing for and conducting split testing can be challenging. But consider A/B testing a vital part of every little update you’re planning and you’ll avoid ungrounded decisions and save a lot of money as well as nerves. Hopefully, our guide to mobile and web app A/B testing will help you with that!
