A/B testing is a critically important tool for improving long-term growth and sustainable revenue for an app. It helps you understand how your users convert, and provides insight into a few key questions:
- What attracts new users.
- How users behave within your app.
- What will keep resonating with users so they stay engaged.
So how can you implement this important optimization technique? Read on to learn proven tips and strategies for A/B testing to help accelerate growth and retention, and how you can create your own A/B test in AppLovin MAX.
What is A/B Testing?
A/B testing allows you to divide an audience into two (or more) groups, change a variable within the content flow being served to those groups, and observe how the change affects each group's behavior.
Armed with that data, you can make informed decisions to improve your users’ experience, acquire new users, and better monetize your app.
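To make the mechanics concrete, here is a minimal sketch in Python, with a hypothetical user ID and test name, of how an audience is commonly split into stable groups: hash each user ID so the same user always lands in the same variant. (MAX handles this assignment for you automatically; the sketch only illustrates the idea.)

```python
import hashlib

def assign_variant(user_id: str, test_name: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'A' (control) or 'B' (test).

    Hashing user_id + test_name gives every user a stable, pseudo-random
    position in [0, 1); users below `split` go to the test group.
    """
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to [0, 1]
    return "B" if bucket < split else "A"

# The same user always gets the same variant for a given test.
print(assign_variant("user-12345", "cta_button_color"))
```

Deterministic hashing (rather than a random roll on each session) matters because a user who bounced between variants would contaminate both groups' data.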
Different types of A/B testing
There are two primary places A/B testing occurs:
- Marketing campaigns, which include ASO (App Store Optimization) and user acquisition strategy.
- In-app, where you test UX/UI, onboarding, and other elements while monitoring things like session time, retention, engagement, and any other app-specific behaviors you want to watch.
MAX provides tools for in-app testing to help you optimize revenue from ads, so you can test things such as:
- Whether ARPDAU increases with frequency cap changes, refresh rates, or bid floors (see the sketch after this list)
- Waterfall optimization
- Network additions/removals
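To illustrate the kind of readout such a test produces, here is a small Python sketch with made-up revenue and DAU figures, comparing ARPDAU (ad revenue divided by daily active users) between a control and a test waterfall:

```python
# Hypothetical daily totals for each group during the test window.
control = {"ad_revenue": 1250.0, "dau": 48_000}  # existing waterfall
test = {"ad_revenue": 1340.0, "dau": 47_500}     # e.g. a new bid floor

def arpdau(group: dict) -> float:
    """ARPDAU = ad revenue / daily active users."""
    return group["ad_revenue"] / group["dau"]

lift = arpdau(test) / arpdau(control) - 1
print(f"Control ARPDAU: ${arpdau(control):.4f}")
print(f"Test ARPDAU:    ${arpdau(test):.4f}")
print(f"Lift: {lift:+.1%}")  # positive lift suggests the change helps
```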
A/B testing examples to get you started
User acquisition / App Store Optimization (ASO)
A/B testing can help you better serve and monetize existing users as well as determine how to acquire new users by testing different app store strategies. This includes all of the different elements — images, screenshots, descriptive text, etc. — that comprise your app’s page in an app store. This is a good place to test different elements to help you dial in what gets users to install your app.
- In-app testing – Within your app itself, you can A/B test a variety of elements.
- Design – Is your CTA (Call-to-Action) button placed in the best location? Does it get clicked more on the left or the right side? Do different colors attract more clicks?
- Player Engagement – Does showing competitive leaderboards increase player engagement? A/B testing can tell you.
- Monetization – Try different banners or test different in-app purchases to find out which receive the most clicks.
Where should I focus my testing?
With so many options and potential elements to test, what are the most important elements to look at? Here are some common ones:
Retention Rate: According to these reports on iOS and Android app retention, average day-30 retention rates come out to 4.13% and 2.6%, respectively. Churn is all too common, but while those numbers seem grim, they also show how much of an edge the apps that get retention right have over the average.
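For reference, day-N retention is typically computed per install cohort: of the users who installed on day 0, what fraction came back on day N? A minimal Python sketch with invented install and activity data:

```python
from datetime import date, timedelta

# Hypothetical data: install date and set of active dates per user.
installs = {"u1": date(2024, 1, 1), "u2": date(2024, 1, 1), "u3": date(2024, 1, 1)}
activity = {"u1": {date(2024, 1, 31)}, "u2": set(), "u3": {date(2024, 1, 15)}}

def day_n_retention(installs: dict, activity: dict, n: int) -> float:
    """Fraction of the install cohort active exactly n days after install."""
    retained = sum(
        1 for u in installs if installs[u] + timedelta(days=n) in activity[u]
    )
    return retained / len(installs)

print(f"Day-30 retention: {day_n_retention(installs, activity, 30):.1%}")  # 33.3%
```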
After someone has downloaded your app, you could test any of the following to help figure out what keeps your users coming back:
- Promotional content, sales, and upgrade offers
- Specific product features (what’s your app’s “killer” feature that users love?)
- Different types of in-app purchases, user rewards, and more
Monetization. Whether you monetize with in-app ads, in-app purchases, or both, you can run A/B tests to optimize your monetization strategy. For example:
- Test ad configurations. You can experiment with different ad monetization strategies, such as adding or removing a network, to measure incrementality. Or test a new price point or regional waterfall optimizations to help identify opportunities to maximize LTV.
- Test factors that might increase IAPs. A/B testing allows you to try different customer engagement methods and messaging to see what works best. Try different messages and message lengths, keep an eye on your CTA, and you’ll learn what truly resonates with users.
Onboarding. A streamlined, easy-to-understand, and painless onboarding process is critical, and A/B testing can help you understand where your users might be struggling and where they are successful so you can improve the experience for them. Figure out what your users love and they’ll keep coming back. Here are some quick A/B test ideas you might use to help improve your onboarding:
- Test different signup timeframes. Do users have to sign in immediately, or do you give them some breathing room before asking? Giving them time to decide gives you more time to showcase your app, and your users may find that pacing less intrusive.
- Test different onboarding forms. After a user signs up, is the process of filling in any forms as quick and easy as it can be? What type of login is required (social media, email, etc.)? Complex or confusing onboarding processes can lose users. Consider testing different forms and flows to see what works best.
- Test different onboarding flows. Show users how many screens or steps remain in the onboarding experience so they understand when it will end. You could test different design layouts, for example displaying the number of screens remaining vs. a dot-style progress bar.
- Test different messaging. A/B testing lets you try different messaging and see what works best, so you can better understand player psychology and motivations and fine-tune your copy. Keep text clear, concise, informative, and easy to read, and use short, action-oriented words that connect with your audience, such as: easy, simple, free, love, new, proven, and save.
How long should I test?
Generally, 1-4 weeks is enough time to collect meaningful data; treat roughly four weeks as the upper limit. Why four weeks? Because A/B testing is not a strictly controlled environment, and sudden, unexpected factors (outages, unexpected trends, seasonality, etc.) can affect your data. Allow enough time to collect data, but not so much that you risk polluting it with external factors.
How many users should I test?
According to Nielsen Norman Group, statistical significance is the probability that an observed result could have occurred randomly, without an underlying cause. This number should be smaller than 5% before you consider a finding significant. For example, if you test two button colors, A and B, track the clicks (conversion rate) for each, and find button B’s conversion rate to be significantly higher, you can be 95% confident that button B genuinely performs better for your users, not just for the sample you tested. How many users that takes depends on your baseline conversion rate and the smallest lift you want to detect.
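The sketch below, in plain Python with made-up conversion numbers, shows both halves of that question: a two-proportion z-test to check whether button B’s lift is significant, and the standard sample-size approximation (95% confidence, 80% power) to estimate how many users each variant needs:

```python
import math

def normal_cdf(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for H0: both buttons convert at the same rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - normal_cdf(abs(z)))

def sample_size_per_variant(p_base: float, lift: float) -> int:
    """Approximate users needed per variant at 95% confidence, 80% power."""
    z_alpha, z_power = 1.96, 0.84  # critical values for alpha=0.05, power=0.80
    p_new = p_base * (1 + lift)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return math.ceil(variance * (z_alpha + z_power) ** 2 / (p_new - p_base) ** 2)

# Hypothetical results: button A converts 200/5000, button B converts 260/5000.
p = two_proportion_z_test(200, 5000, 260, 5000)
print(f"p-value: {p:.4f}")  # below 0.05, so significant at 95% confidence
print(sample_size_per_variant(0.04, 0.25))  # users/variant to detect a 25% lift
```

With a 4% baseline conversion rate, detecting a 25% relative lift requires roughly 6,700 users per variant, which is why small effects need large audiences or longer tests.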
How do I create an A/B Test in MAX?
If you’ve never created an A/B test before in MAX, here’s how:
- From the MAX dashboard, in the left navigation column, select Mediation > Manage > Ad Units.
- Click the ad unit for which you would like to set up an A/B test.
- In the Edit Ad Unit window, in the Default Waterfall tab, open the ⋯ menu and then select Create AB Test.
- Check the Copy existing ad unit configuration… box, and then click Create AB Test. This copies your existing waterfall to your new test.
Note: MAX copies your existing waterfall structure into the test structure you create. You only need to make the changes you want to test in your new Test Ad Unit; you do not have to build a completely new waterfall from scratch.
- When you create a new A/B test, MAX applies the Test configuration to 50% of your users by default, but you can change this value.
Congratulations! You’ve created an A/B test. Now what?
Now that you have an A/B test to work with, what should you begin testing, and what are some good testing strategies?
- Define what you want to test. You can test and analyze nearly anything, but it’s a good idea to start with a hypothesis.
- Test one change at a time. While you can make multiple changes in a single test (called multivariate testing), it’s best to start with a single change. This makes it easier to zero in on exactly what is improving (or harming) results.
Get started with your A/B testing to engage and retain users with MAX.