Mobile marketers today understand how risky it is to underestimate the role of app store product page visuals in conversion rate optimization. A well-crafted app or mobile game icon, screenshots, and app previews are now no less important than an optimal keyword set.

To make data-driven decisions when choosing creatives for your app store listing, you need to run mobile A/B testing experiments. If you are looking for best practices in this field, read on: this guide covers how to do app store A/B testing to get reliable results and improve your conversion rate.

Look around for ideas

Start with research: look around and try to get ideas from market trends, your app category, and competitors. Here is a short checklist of what to do as a first step:

  • study current design trends;
  • check out the latest industry trends;
  • find out best practices for your app or game category;
  • examine your competitors’ product pages;
  • analyze your own app store product page.

Formulate a hypothesis

Once you’ve carried out research and found some ideas, it’s time to formulate a hypothesis. For example, you decide that screenshots with a green background might perform better than the current ones with a white background. Or that you should add a caption with a call to action to your first screenshot. Or else, you would like to change your app’s icon on the App Store, but first you have to make sure that it will perform better than the current one. So, you need to test the hypothesis on a mobile A/B testing platform.

Choose elements for testing

Decide which product page elements you are going to start with. I’d recommend starting A/B testing for your mobile game or app with the screenshots, icon, and app previews, and then moving on to the title, subtitle, and descriptions.

Tip: Avoid A/B testing several elements & changes at once

Suppose you make several changes at once: you change the screenshots’ background color, pick another icon, and swap the first and third screenshots, then compare the performance of this new variation to the current one. Such a test makes little sense: since you are effectively testing several hypotheses at once, even if there is a winner, you won’t know which hypothesis was correct or what exactly affected the result. So make only one change at a time and test only one element: for example, change the game character on your icon and see how it affects the conversion rate.

Design variations

Once you have a hypothesis and know which product page elements you’re going to test, brief your colleagues on the design team and ask them to develop several variations: a few versions of the icon or screenshots with significant differences.

Note: it’s a good idea to adjust creatives to the current trends, holidays and seasonal events. 

Tip 1: When designing new visuals, stick to the principles of Visual Salience.

In short:

  • choose contrasting colors for your icon and screenshots: colors that make your creatives stand out from competitor apps;
  • add a salient item to your icon or screenshots;
  • highlight areas with CTAs on your screenshots;
  • place tilted elements on your icon or screenshots;
  • create landscape screenshots instead of, or in addition to, portrait ones;
  • use objects of various sizes as part of your screenshot design.

Learn more about the principles of Visual Salience here.

Tip 2: Avoid testing changes in design that are too small

I’ve already mentioned that variations of icons and screenshots should have significant differences, and this is another important point for getting app store A/B testing right.

Users just do not notice tiny changes in design, so if your variations look very similar, they will most likely produce the same result. To make sure they are distinct enough, show several variations to your colleagues and ask if they see any difference.

Choose traffic source 

You will need to drive paid traffic to your mobile A/B testing landing pages, so set up a traffic source for your experiment. We at SplitMetrics recommend using Facebook for this purpose, but you can choose any other traffic source: a website, ad networks, in-app cross-promotion and more.

Once you have decided on the traffic source:

  • manage audience settings to target your Ideal Customer Profile; 
  • develop a web banner.

Tip: Your banner ads shouldn’t have the same design as one of your app store creatives

Research on processing fluency shows that when people see objects that are already familiar to them, their brains process the information faster, which results in a feeling of liking. So, if your banner ads share a design with one of the variations, users will prefer that variation. That’s why, to obtain unbiased results, you need to develop a web banner that differs from the variations you’re going to A/B test.

Run a test

Now you have a hypothesis and several variations of the element you would like to test. You have chosen the traffic source and defined the target audience. It’s time to run an A/B test on a mobile A/B testing platform. If you prefer sequential A/B testing, which the SplitMetrics platform provides, here are the steps at this stage:

  • choose the experiment type;
  • upload visuals and set up your landing pages;
  • specify the baseline conversion rate (the conversion rate of your current variation), statistical power (80% by default), significance level (5% by default) and the Minimum Detectable Effect (MDE) to determine the sample size (see the sketch right after this list).
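
To get a feel for how those inputs translate into a traffic target, here is a minimal sketch of the standard sample size formula for a two-proportion test. It is an illustration only, not SplitMetrics’ actual implementation; the function name and the example numbers are assumptions:

```python
import math
from statistics import NormalDist

def sample_size_per_variation(baseline_cr: float, relative_mde: float,
                              alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed in EACH variation to detect a relative lift of `relative_mde`."""
    p1 = baseline_cr                        # current variation's conversion rate
    p2 = baseline_cr * (1 + relative_mde)   # the lift you want to be able to detect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance level
    z_power = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Example (hypothetical numbers): a 30% baseline conversion rate and a 10%
# relative MDE at the default 5% significance level and 80% power:
print(sample_size_per_variation(0.30, 0.10))  # -> 3763 visitors per variation
```

Note how sensitive the result is to the MDE: halving the detectable lift roughly quadruples the required traffic, which is one more reason to avoid testing tiny design changes.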

Then run the test. The app store A/B testing platform will analyze how users interacted with your app page and determine the winning variation.

Tip 1: Don’t stop an experiment before you get the right amount of traffic

If you prefer classic A/B testing, don’t finish an experiment before the necessary amount of traffic (the sample size) is reached, or you might get statistically unreliable results. The required amount of traffic must be reached for both the A and B variations.

However, if you choose sequential A/B testing, you can check the results at any time.

Tip 2: Don’t stop an experiment until a 7-day period is over

It’s no secret that different apps and games experience bursts of activity on different days of the week. So, to get reliable results, you need to capture your app’s peak days during an app store A/B testing experiment and not finish it before a full weekly cycle has passed.

Again, this is important if you follow classic mobile A/B testing. A minimal check that combines both tips is sketched below.
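
If you track the numbers in your own dashboards, a trivial guard can enforce both tips before you read out a classic test. This is just a sketch; the counters, dates, and function name are assumptions rather than any platform’s API:

```python
from datetime import date, timedelta
from typing import Optional

def safe_to_read_results(visitors_a: int, visitors_b: int, required_n: int,
                         start_date: date, today: Optional[date] = None) -> bool:
    """A classic (fixed-horizon) test should be read out only when BOTH
    variations have hit the pre-computed sample size AND a full weekly
    cycle has passed, so the app's peak days are captured."""
    today = today or date.today()
    enough_traffic = visitors_a >= required_n and visitors_b >= required_n
    full_week = (today - start_date) >= timedelta(days=7)
    return enough_traffic and full_week

# Enough traffic in both variations, but only 5 days in, so keep it running:
print(safe_to_read_results(4000, 4100, 3763, date(2024, 3, 1), date(2024, 3, 6)))  # False
```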

Analyze the results

You have a winner! But you also have valuable data: the time users spent on your app’s page, how they interacted with it, which elements drew their attention and, most importantly, how many of them eventually tapped the download button.
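
If you export the raw visitor and install counts, you can also sanity-check the winner yourself. Below is a minimal sketch of the classic two-sided two-proportion z-test; it illustrates the underlying math rather than SplitMetrics’ exact algorithm, and the counts in the example are made up:

```python
from statistics import NormalDist

def compare_variations(visitors_a: int, installs_a: int,
                       visitors_b: int, installs_b: int,
                       alpha: float = 0.05) -> None:
    """Two-sided two-proportion z-test on the conversion rates of A and B."""
    cr_a = installs_a / visitors_a
    cr_b = installs_b / visitors_b
    pooled = (installs_a + installs_b) / (visitors_a + visitors_b)
    se = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (cr_b - cr_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"CR A = {cr_a:.2%}, CR B = {cr_b:.2%}, p-value = {p_value:.4f}")
    print("Significant winner" if p_value < alpha else "No significant difference yet")

# Hypothetical readout: 30.00% vs 32.76% conversion on 3,800 visitors each
compare_variations(3800, 1140, 3800, 1245)  # p-value around 0.009: significant
```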

Note: it’s fine if your hypothesis has not been confirmed; you have avoided a mistake and now know what works best for your game or app. It is important not to stop there but to run follow-up experiments based on the data obtained. After all, conversion optimization is a continuous process, so don’t give up on A/B testing after just one experiment.

Apply the results

If your hypothesis is correct, go ahead and make the necessary changes. Replace the icon, screenshots or title with the winning variation. That will help you optimize the conversion rate and get better results on the App Store or Google Play.

Tip: Mobile A/B testing, like ASO, is a continuous process.

You might need many iterations before you perfect your app store product page elements and meet the needs of your target audience.

However, at some point you will get there: your app store listing and conversion rate will become optimal, so that subsequent tests and changes won’t bring you any tangible results.

Repeat previous steps

Still, there are occasions when it makes sense to re-optimize an app store product page and run mobile A/B testing experiments to make sure your app benefits from the changes:

  • App Store or Google Play redesign: changes in the store’s design require you to adapt your visuals to the updated user experience.
  • Holidays or seasonal events: adjusting app store creatives to a season or a specific event may bring you a wave of downloads. But first, of course, you should test this hypothesis.
  • Competitor apps have changed their visuals: borrow some ideas and differentiate from competitors based on visual salience principles – win-win.
  • Emergence of new trends in design.
  • New findings about visuals that perform better than others and drive a higher conversion rate. For example, a new banner ad shows amazing results and you think the image might also work on your app store listing. It might not, though, so run an A/B testing experiment to make sure.

If you want to learn more from the team that has been doing A/B testing for apps since 2014, get in touch with SplitMetrics.