How A/B testing and experimenting can provide validating insights for your future ASO strategy


COUP was a major urban mobility service provider. They offered an electric scooter (e-scooter for short) sharing app where users could find, unlock, view and rent any available e-scooter around them. Their services stood out for stable pricing, durable Gogoro batteries, and ease of use. It’s safe to say that, for a long period of time, COUP was the best pick for e-scooter sharing across three major European cities: Berlin, Madrid and Paris. As a market leader in their field, and with great potential in organic user acquisition, COUP became our client in April 2019, when we started helping them develop and run App Store Optimization (ASO) campaigns. The goals were twofold:
  • To increase the COUP e-scooter app’s organic visibility,
  • To increase its conversion rate (CVR), so that the boosted visibility turns into more installs.
To achieve these goals, we performed multiple iterations of keyword optimization (KWO) for COUP to boost organic search traffic, and ran A/B tests on their app store screenshots to continuously optimize the app’s CVR. Because we have already covered keyword optimization in our case study about CodeCheck, this time around we’ll talk about our conversion rate optimization (CRO) success story with COUP – via A/B testing.

A/B testing, in a nutshell, is an idea validation method used in CRO that involves experimentation in the app store. When you have an idea to improve CVR but aren’t sure it will work, you can run an experiment to see whether the test variant that embodies your idea outperforms what’s currently live. The method is also called split testing because it works by splitting your app’s users into two or more groups, then serving each group one variant based on one idea.
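For the technically curious: the split itself is handled by the store platform, but the underlying idea can be sketched in a few lines. Here is a minimal, hypothetical illustration (the function name and user IDs are our own, not part of any store API) of deterministic bucketing, where hashing a user ID keeps each user in the same group for the whole test:

```python
import hashlib

def assign_variant(user_id: str, variants: list) -> str:
    """Deterministically bucket a user into one of the variants.

    Hashing the user ID keeps the assignment stable across sessions,
    so the same user always sees the same screenshot set.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: a two-variant (A/B) split
group = assign_variant("user-42", ["control", "variant_1"])
```

With a good hash, roughly half of all users land in each group, which is exactly the even traffic split an A/B test needs.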

The problem:

When we kicked off our ASO project with COUP, their original set of screenshots was already good: a white background and a clean, modern, clear design. The strategy in place at the time was minimalistic, straightforward and honest – the idea was to explain, in technical terms, what COUP as a shared e-scooter service provider could do.

However, we felt the technical angle might have left out a large number of COUP’s potential customers. In our experience, business-to-consumer (B2C) brands like COUP tend to attract more users via emotional, dynamic design and communication approaches rather than technical ones. We also had multiple ideas in mind as to what alternative approaches might look like. What was missing was a reliable way to identify what would work best – and that’s where A/B testing came in.

So, here are the screenshots they were using...

The solution:

We developed three communication strategies for COUP’s app store screenshots to test against the original version. We ran them in three sequential tests, each lasting one month. We wanted to keep conditions consistent and give each variant as much traffic as possible (a single ABCD test would allocate only 25% of traffic to each variant).

Our strategy in this project was not to quickly generate tactical-level learnings or quick results – we could do that later – but rather to gain the strategic, high-level insights that would validate a whole roadmap of A/B tests to come.

Let’s have a look at our new test variants created by Customlytics:

Test 1

The concept behind this variant is the journey users can take with COUP, as in “the fastest way from A to B”. The background replicates the traffic map from the app and symbolizes that journey. Iconic representations of the three main cities COUP supported – Berlin, Paris and Madrid – say more with fewer words. We also increased the contrast between the background and the captions to improve legibility.

Test 2

This variant took a step closer to the “rebellious” side of marketing. Its approach revolves around promotions, using social proof, awards, vouchers and added value. We also relied less on the map in the background and played a bit with the second brand color – green – to contrast with the blue theme of variant #1. Finally, we “glorified” the scooter a bit, showing it without a rider, to emphasize the “prize” users could get: high-quality, safe, durable and modern e-scooters.

Test 3

This one goes the furthest towards the rebellious side of marketing – so much so that it looks a bit quirky and out-of-the-box. We wanted to explore the “wild” side of COUP’s users and see if something unusual like this would make a difference. We also treated it as a free-flow testing ground where we played to learn about many aspects of the app’s screenshots rather than playing it safe. We combined both of COUP’s brand colors and kept the driverless scooter as well as the flat background from variant #2. However, we used variant #1’s story of a journey from A to B, as well as its use of icons – though instead of the cities, we focused more on the product.

The results:

The initial result after testing variant #1 against the original version (the Control variant) was very positive. According to statistics from Google Play’s store listing experiments (the primary A/B testing platform we use), Test variant #1 converted about 20% better than the Control – already a major achievement.

With this result, here is what we did next:

  • We replaced the Control variant screenshots with the Test variant #1 ones.
  • We prepared for the second round with Test variant #2, which would then be tested against Test variant #1.

The result of the second test was a loss. This means Test variant #1 performed better than #2. However, we treated this as a learning and not a failure – it helped us rethink priorities in COUP’s CRO strategy. For instance, we would now prioritize the blue brand color over the green and focus on the storytelling of the journey instead of the promotions.

Next, we produced Test variant #3 and ran it against #1. Similar to the second test, the result was a loss – except the margin was larger. This helped us eliminate most of the doubts we had – now we knew “quirky” wasn’t meant for COUP, and it would be best to avoid this approach.


After a three-month A/B testing project, we helped our client COUP learn a lot about their market and users. We also helped them establish a foundation for CRO. In short, our results include:

  • Initial uplift in CVR: +20%
  • Test variants: 3
  • Validated strategy to scale: 1
  • Happy client: 1
What we managed to help COUP with was important, but it’s just one piece of app marketing. In cases like this, we recommend running performance marketing campaigns as well, to boost the traffic that every A/B test needs to show results with high statistical significance. Want to see how we help other clients with performance marketing? Stay tuned!
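A quick aside on significance: whether a measured CVR difference is a real uplift or just noise can be checked with a standard two-proportion z-test. Here is a minimal sketch with made-up numbers (not COUP’s actual data):

```python
from math import sqrt, erf

def conversion_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 500 installs from 10,000 visitors (control)
# vs. 600 installs from 10,000 visitors (test variant)
z, p = conversion_significance(500, 10_000, 600, 10_000)
```

A p-value below the usual 0.05 threshold would suggest the uplift is unlikely to be random – which is why both variants need enough traffic before any conclusion is drawn.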

Learn more about App Store Optimization

Interested in this service for your mobile app?