No Matter How Sure You Are - Keep Testing

There’s always a tension between the accumulated knowledge or ‘corporate wisdom’ within any organization and a ‘data-driven’ testing culture. At its worst, the former simply overrides or refutes the need for the latter: you have a business driven by the HiPPO, the ‘highest paid person’s opinion’.

Of course, many organizations are full of smart people with wide-ranging experience who collectively bring a huge amount to the table when it comes to navigating business challenges. But the smartest are willing to let data challenge their preconceptions, and, more importantly, are open to changing how they operate on the back of test results. They are willing to ‘let go’.

A Case In Point - Space Ape Games

We recently worked with Space Ape on a project that illustrated this approach perfectly. Hopefully they need no introduction: as the force behind global smash Samurai Siege, and with a team boasting many years of collective experience in the game and mobile app businesses, it’s fair to say they know what they’re about.

So when they devised an interactive tutorial experience that brought users through the key aspects of the game via a prologue battle scene, they were quietly confident they had a winner on their hands. The initial experience in any game is absolutely vital - the typical numbers for Day 1 and Day 7 retention confirm that - and Space Ape had put significant effort into creating the best experience possible.

“It was an innovative approach, and we’d spoken to plenty of people during development to get a user’s take on it. We were proud of our work, that’s for sure,” says Simon Hade, co-founder of Space Ape.

In many organizations that would be considered a job well done. Not for Space Ape - because in addition to those years of experience, they have a healthy respect for what player data tells them. And to that end, they went out and A/B tested their tutorial.

The Test

Despite a certain fondness for their tutorial masterpiece, the team pitted it against three competing variants, each offering a different experience, including one that presented a more traditional tutorial flow. The results demonstrated, in a statistically significant manner, that the traditional approach delivered better results for their business.

Day 1 retention showed an uplift of 6%. Even more significantly, total revenue from the group exposed to that variant was up 36% on the control.
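For readers curious what “statistically significant” means in practice here, a retention comparison like this is commonly checked with a two-proportion z-test. The sketch below is illustrative only - the sample sizes and retained-user counts are invented for the example and are not Space Ape’s actual data:

```python
from math import sqrt, erf

def two_proportion_z_test(retained_a, n_a, retained_b, n_b):
    """Two-proportion z-test: is variant B's retention rate
    significantly different from control A's?"""
    p_a, p_b = retained_a / n_a, retained_b / n_b
    # Pooled proportion under the null hypothesis (no difference)
    p_pool = (retained_a + retained_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 5,000 users per group, control retains 40%,
# variant retains 42.4% (roughly a 6% relative uplift)
z, p = two_proportion_z_test(2000, 5000, 2120, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up figures the p-value comes out below the conventional 0.05 threshold, which is the kind of evidence that lets a team act on an uplift with confidence rather than on gut feel.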

And what happened next demonstrated the data-driven mindset at its best. The Space Ape team effectively threw away their work.

“We found that our initial prologue appealed to a certain demographic. A demographic that looked an awful lot like ourselves - the developers. No wonder we were pleased with it!” says Hade. “But the truth was that for most of our audience, it wasn’t an optimal experience at all. We proved ourselves wrong - and were happy to do it.”

The lesson: no matter how attached you are to your work, be prepared for your users to prove you wrong. And when they do, accept their judgement and move on!