We talk a lot about a/b testing here on MarketingExperiments. What we don’t usually talk about is a/b testing for the mobile web…especially testing within mobile apps.
I thought we should change that. As I was scouring the web looking for mobile a/b tests, I found this two-year-old video by Amazon.
Apparently, Amazon Web Services (AWS) at one point had an a/b testing feature that has since been shut down.
When they had testing, however, one app developer used it extensively and shared their experiences in a promotional video for the feature on Amazon. The developers were behind the game Air Patriots. Russell Caroll was the Senior Producer for the game and Julio Gorge was the Game Development Engineer. The game is a kind of aerial take on the classic tower defense game genre.
Now granted, this was a promotional video, but the content still speaks for itself. These guys had (and still have by the looks of it) a fairly successful mobile app and they ran some successful tests. It’s a great starting place for what you can test in your mobile app.
By the way, while Amazon has shut down its a/b testing feature, there are a lot of other tools for testing mobile apps that will accomplish the same thing the developers talk about in the video.
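Whichever tool you pick, the core mechanic is the same: each player gets deterministically bucketed into a variant so they see the same experience every session. Here's a minimal sketch of how that bucketing typically works under the hood (the user ID, experiment name, and variant names are made up for illustration):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into a variant by hashing
    the user ID together with the experiment name. The same user
    always lands in the same variant for a given experiment."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: a player is pinned to either "control" or "with_ad".
variant = assign_variant("player-123", "menu-ad-test", ["control", "with_ad"])
```

Hashing on the experiment name as well as the user ID means the same player can land in different buckets across different experiments, which keeps tests independent of each other.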
The first thing the team tested was the impact ads had on their customers. They wanted to make sure the ads did not harm the customer experience. So they tested a single ad in the main menu near the bottom of the screen.
They found that the ads didn’t affect customer retention. This meant that they could insert ads and generate more revenue without hurting their customers.
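The video doesn't share the team's actual numbers, so the figures below are made up, but "the ads didn't affect retention" usually means a comparison like this one: a two-proportion z-test on retention rates between the control and the ad group.

```python
import math

def retention_z_score(retained_a: int, n_a: int, retained_b: int, n_b: int) -> float:
    """Two-proportion z-test: how many standard errors apart are the
    retention rates of group A (control) and group B (saw ads)?"""
    p_a, p_b = retained_a / n_a, retained_b / n_b
    p_pool = (retained_a + retained_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical numbers: 40.0% vs. 39.0% day-7 retention.
z = retention_z_score(400, 1000, 390, 1000)
# |z| < 1.96 means no significant difference at the 95% confidence level.
```

With a difference that small and samples that size, |z| stays well under 1.96, which is the kind of result that let the team conclude the ads weren't hurting retention.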
In the second test, the team put ads in the game screen.
In both the first and second tests, the ads had a little “X” that customers could tap, ostensibly to dismiss them. When they tapped it, a pop-up told them they could eliminate ads with any purchase in the game’s store.
In this test, there was, again, no impact on customer retention, but there was a statistically significant increase in revenue.
In this test, the team wanted to know whether changing the icon linking to GameCircle (Amazon’s game stats and leaderboards portal) would improve performance.
It’s not clear which icon won, or even why this particular test was useful for the team, but they did get a favorable result, and the lesson they wanted to drive home was that simple changes like icons can make a difference. We’ve, of course, found that to be the case in a large number of our tests on MarketingExperiments.
In the fourth test, Caroll made a mistake: he accidentally made the game about 10% harder. As a result, every metric that was important to them tanked.
The team of course fixed it as fast as possible, but it gave them an idea.
What would happen to revenue if they made the game easier?
So they ran a test with five treatments: the control, plus four difficulty levels easier than the control.
It turned out that the easiest difficulty performed the best. By making the game easier, they got players playing 20% longer, and revenue went up 20%.
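The video doesn't say how much easier each of the four treatments was, so the multipliers below are purely hypothetical, but a five-treatment difficulty test could be wired up along these lines, with each player pinned to one treatment for the duration of the test:

```python
import hashlib

# Hypothetical difficulty multipliers -- the video doesn't disclose
# the actual values the Air Patriots team tested.
DIFFICULTY_TREATMENTS = {
    "control":  1.00,
    "easier_1": 0.95,
    "easier_2": 0.90,
    "easier_3": 0.85,
    "easier_4": 0.80,
}

def difficulty_for(user_id: str) -> float:
    """Pin each player to one of the five treatments for the whole test
    by hashing their ID, so their difficulty never changes mid-test."""
    names = sorted(DIFFICULTY_TREATMENTS)
    digest = hashlib.md5(f"difficulty-test:{user_id}".encode()).hexdigest()
    return DIFFICULTY_TREATMENTS[names[int(digest, 16) % len(names)]]

# In the game loop, the multiplier would scale whatever drives
# difficulty, e.g.: enemy_speed = base_speed * difficulty_for(user_id)
```

The key design point is that the multiplier is a single tuning knob, which is what makes it cheap to spin up four extra treatments instead of just one.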
The team then tested a push notification that offered inactive players an incentive to pick the game back up.
They wanted to know when the best time to send the notification would be. So they tested a few different send times and found that the best was three days after the last game play.
They also found that sending the notification 7 days after the last game play negatively impacted their performance metrics.
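Translating that finding into app logic is straightforward. Here's a minimal sketch (the function and parameter names are mine, not from the video) that fires the win-back push at the winning three-day mark and only once, since the seven-day send actively hurt their metrics:

```python
from datetime import datetime, timedelta

# Three days after the last session won the test; waiting seven days
# actually hurt the team's performance metrics.
WINBACK_DELAY = timedelta(days=3)

def should_send_winback(last_played: datetime, now: datetime,
                        already_sent: bool = False) -> bool:
    """Send the win-back push once a player has been inactive for at
    least the winning delay, and never send it twice."""
    return not already_sent and (now - last_played) >= WINBACK_DELAY
```

A scheduler would run this check daily per inactive player and record `already_sent` so nobody gets nagged repeatedly.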
With these five tests, and probably a few more that have been happening off the record, the team was able to develop a great app for their customers and steadily increase their revenue. At the end of the video, Caroll gives a few key takeaways for marketers who are a/b testing their apps.