A core component of any good marketing and App Store Optimization (ASO) strategy is conducting A/B tests and making iterative changes based on your findings. A/B testing is useful because it provides insight into user behavior, helps you understand which of your assets perform better, and ultimately drives more conversions. For Google Play, A/B testing has been available through Google Play Experiments since May 2015. Until recently, iOS developers had to rely on workarounds such as creative set testing via Apple Search Ads or comparing metrics before and after a deployment. With Apple’s recent release of Product Page Optimization, those workarounds are now a thing of the past.

Another exciting new feature that Apple rolled out recently is Custom Product Pages. It is often used in combination with Product Page Optimization, but the two are separate tools with different objectives. Product Page Optimization aims to improve conversion through A/B tests whose winning treatments can ultimately be applied to the original product page. Custom Product Pages, on the other hand, focuses on targeting more specific audiences by creating different product pages that highlight different app features. These pages do not appear on the App Store in the usual way; instead, each is tied to a unique shareable URL that can be included in external marketing channels. Fundamentally, both features aim to drive conversion, but they do so through entirely different methods.

Features and function of Product Page Optimization

At its core, Product Page Optimization offers iOS developers an A/B testing tool for their creative and metadata assets. Developers can now set up as many as three additional product pages alongside their default product page. One of the major benefits of Product Page Optimization is the ability to submit creative assets for review independently of a new app version. This means that no new build is needed for creative set testing, and the turnaround time for cycling through tests and gathering data is much faster. That said, it is important to note that any metadata changes still need to go through Apple’s review process alongside a new app version.

Test pages in Product Page Optimization can include a variety of treatments to creative assets, including the app icon, app preview videos, and screenshots. A/B tests can be localized, but only for localizations that are already live on the App Store. A general rule of thumb when launching an A/B test is to keep a clear objective in mind for each test. Ideally, developers should limit the number of assets changed so they can better deduce which element led to which result. By making iterative changes to one element at a time, its impact can be measured and incorporated into the next round of tests. Even if a test does not show favorable results for a treatment, it still provides valuable insight into what users prefer to see and how to shape the next test. Applying such an approach can help apps stay fresh and keep pace with current trends among competitors.

With Product Page Optimization, developers can choose up to three different treatments to test against the original product page. Tied to this choice is deciding what percentage of product page traffic is randomly shown a treatment instead of the original page. For example, if 50% of traffic is allocated to the test and two treatments are running, each treatment receives 25% of the total traffic, while the original product page receives the remaining 50%. Keep in mind, however, that only users running iOS 15 or later will see treatments while a test is live.
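As a quick illustration of the traffic-split arithmetic above, the sketch below computes how impressions are divided between the original page and its treatments. The function name and structure are purely illustrative and not part of any Apple API.

```swift
// Minimal sketch of Product Page Optimization's traffic split.
// Illustrative only; this is not an Apple API.
func trafficShares(testTrafficPercent: Double, treatmentCount: Int) -> (original: Double, perTreatment: Double) {
    let perTreatment = testTrafficPercent / Double(treatmentCount)
    let original = 100.0 - testTrafficPercent
    return (original, perTreatment)
}

let split = trafficShares(testTrafficPercent: 50, treatmentCount: 2)
print("Original page: \(split.original)%")       // Original page: 50.0%
print("Each treatment: \(split.perTreatment)%")  // Each treatment: 25.0%
```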

Benefits & importance of testing

Overall, the goal of A/B testing is to find out which assets improve conversion. For creatives specifically, there are a number of benefits to consider. Testing creatives can help developers determine whether a certain value proposition resonates better with users, whether particular color themes or design styles increase conversion, or whether incorporating seasonal content has an impact on app downloads. Additionally, if an app is live in different territories, creative set testing can help assess whether particular features or characters are more relevant to a specific target audience.

By testing different variants, you can significantly reduce the time needed to decide which strategy to move forward with. Instead of waiting for App Store Connect to collect data after every new deployment, Product Page Optimization gives developers a side-by-side comparison of how live assets perform against a treatment. Ultimately, A/B testing provides clear guidance on what target demographics are looking for and helps improve conversion by uncovering insights over time and making iterative changes.

Tracking performance

As with other performance metrics, Apple surfaces the data for each test in App Analytics. There, developers can view impressions, conversion rate, percent improvement, and confidence level relative to a baseline of their choice. By default, a test’s baseline is the app’s original product page, but it can be changed at any time to compare performance against whichever treatments are running. To give developers a better sense of a test’s current status, additional indicators appear over time as more data becomes available.
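The percent improvement figure can be read as the relative lift of a treatment’s conversion rate over the baseline’s. The helper below is a hypothetical illustration of that arithmetic, not Apple’s implementation.

```swift
// Hypothetical illustration of the percent-improvement figure:
// the relative lift of a treatment's conversion rate over the baseline's.
func percentImprovement(baselineConversionRate: Double, treatmentConversionRate: Double) -> Double {
    (treatmentConversionRate - baselineConversionRate) / baselineConversionRate * 100
}

// A treatment converting at 3.6% against a 3.0% baseline shows a ~20% improvement.
print(percentImprovement(baselineConversionRate: 0.030, treatmentConversionRate: 0.036))
```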

When it comes to testing creatives, the conversion rate is one of the most important metrics to consider. Since creative assets like screenshots are not indexed by Apple, their impact lies in the eyes of the beholder, in this case the user scrolling through the App Store. The conversion rate measures how often users who view an app go on to download it. When testing different treatments, the conversion rate shows which creative set resonates better with users and convinces more of them to download. Beyond conversion rate, another core metric to watch is the confidence level, which indicates how likely it is that the observed difference between two variants reflects a real difference in performance rather than chance. When App Analytics indicates that a treatment is “performing better”, it is outperforming the baseline with at least 90% confidence, meaning that if the test were repeated, you would see the same result roughly 90% of the time. The same principle applies when Apple indicates that a treatment is “performing worse”.
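Apple does not publish the exact statistics behind these indicators, but a standard two-proportion z-test gives a rough sense of how impressions, conversion rates, and a confidence level relate. The sketch below is an illustrative assumption along those lines, not Apple’s methodology.

```swift
import Foundation

// Rough sketch: conversion rate is downloads divided by impressions, and the
// confidence that a treatment converts differently from the baseline is
// approximated here with a two-proportion z-test. Illustrative only.
struct VariantResult {
    let impressions: Double
    let downloads: Double
    var conversionRate: Double { downloads / impressions }
}

/// Approximate confidence (0...1) that two variants convert at different rates.
func confidenceLevel(baseline: VariantResult, treatment: VariantResult) -> Double {
    let pooledRate = (baseline.downloads + treatment.downloads) /
                     (baseline.impressions + treatment.impressions)
    let variance = pooledRate * (1 - pooledRate) *
                   (1 / baseline.impressions + 1 / treatment.impressions)
    let z = abs(treatment.conversionRate - baseline.conversionRate) / sqrt(variance)
    let normalCDF = 0.5 * (1 + erf(z / sqrt(2.0)))  // standard normal CDF at z
    return 2 * normalCDF - 1                        // two-sided confidence
}

let original  = VariantResult(impressions: 10_000, downloads: 300)  // 3.0% conversion
let treatment = VariantResult(impressions: 10_000, downloads: 360)  // 3.6% conversion
print(confidenceLevel(baseline: original, treatment: treatment))    // ≈ 0.98
```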

Breaking down metrics and determining core findings for each A/B test is a valuable step in continuing the process of iterative testing. As developers learn which variant performs best, it is important to keep thinking about what the accrued data means and what to test next. Each winning variant that is applied should help convert even more users, since you are building on previous insights. This strategy can help widen the funnel and convert a broader range of users.

Overall

Apple’s new Product Page Optimization feature is a game changer for iOS developers and App Store Optimization. By introducing a native A/B testing tool, Apple gives iOS developers the opportunity to change creative assets faster and iterate on tests to continuously improve conversion. Apple’s new submission experience means that Product Page Optimization tests can be submitted for review independently of a new app version. These new actionable insights into conversion present an exciting opportunity for developers and ASO experts to start testing.