Why Flurry's retention metrics are misleading developers

Playground's Wilhelm Taht reveals all

I truly applaud the amazing footprint Flurry has achieved in the mobile app ecosystem. It's not easy for a new company to carve out its space in a growing and crowded marketplace.

Flurry has been at the forefront of teaching developers, whether indie or not, the importance of understanding user behaviour and analytics.

On top of that, it has become close to the de facto analytics toolkit that every up-and-coming developer uses in their releases. This is an amazing achievement given the especially competitive nature of the mobile sector.

Flurry packs in a ton of useful features, and when you put its analytics to appropriate use, you can extract some truly valuable information.

However, there's one rather crucial area where both Flurry and its credibility as an analytics platform stumble. Let me explain.

Devil is in the detail

Retention is the number one key metric upon which all other elements of a well functioning freemium economy are built. It is the foundation of a freemium economy.

Don't believe me? Check out Eric Seufert's excellent Minimum Viable Metrics for some background information.

As Eric explains, retention enables the other three pillars of a freemium economy - engagement, virality and monetisation - to work, at least on a theoretical level. Consequently, without appropriate retention metrics, the other three pillars fall apart.


Retention is the #1 metric that developers and publishers study when a freemium game or app is launched. It's studied in detail when "soft launching" a game in a single country (just as Supercell, for example, is doing at this very moment in Canada with Boom Beach, and Glu with FrontLine Commando 2).

Retention is studied in detail by management of developers and publishers of freemium games. It is studied in detail by venture capitalists. It is studied in detail by product managers. It is an important metric for a freemium product.

Simply put, the retention metric the industry is interested in is defined as the "percentage of users that came back to your game on the Nth day after first using the app", better known as 'Day N Retention'.

When looking at retention figures, the industry usually focuses on the day 1, day 7 and day 30 figures as the benchmarks for calculating a user's lifetime - and eventually the lifetime value of the user, once monetisation is added to the equation.

In the world of freemium mobile and tablet games, the minimum retention thresholds that the industry is looking for are usually defined as:

  • Day 1: minimum 30 percent (preferably 40 percent) of users who downloaded the game on "Day 0" return to the game
  • Day 7: minimum 15 percent (preferably 20 percent) of users who downloaded the game on "Day 0" return to the game
  • Day 30: minimum 8 percent (preferably 10 percent) of users who downloaded the game on "Day 0" return to the game
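The 'Day N retention' defined above can be sketched in a few lines of Python. The data shapes here (an install-date map and a set of active-day events) are purely illustrative assumptions for the sake of the example - this is not anything Flurry itself exposes:

```python
# Sketch of classic 'Day N retention': the share of a cohort that
# returned to the app exactly N days after install.
from datetime import date, timedelta

def day_n_retention(installs, sessions, n):
    """installs: dict of user_id -> install date.
    sessions: set of (user_id, date) pairs marking days a user was active.
    Returns the percentage of the cohort active exactly on day N."""
    cohort = list(installs)
    if not cohort:
        return 0.0
    returned = sum(
        1 for user in cohort
        if (user, installs[user] + timedelta(days=n)) in sessions
    )
    return 100.0 * returned / len(cohort)

# Toy cohort: 4 installs on "Day 0", of which 2 come back on day 1.
d0 = date(2014, 2, 1)
installs = {u: d0 for u in ("a", "b", "c", "d")}
sessions = {("a", d0 + timedelta(days=1)), ("b", d0 + timedelta(days=1))}
print(day_n_retention(installs, sessions, 1))  # 50.0
```

On the toy cohort above, Day 1 retention is 50 percent - comfortably above the 30 percent minimum threshold.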

Unknown to the majority of the industry, Flurry does not show this type of metric as retention. Instead, Flurry shows something it labels Rolling Retention.

Flurry's definition for Rolling Retention is the share of users that came back to your app on the Nth day after first using the app, or any day after the Nth day.

At first glance, the difference might not seem apparent or crucial, but when you dig a little deeper it becomes clear that it is.

Simply put, 'rolling retention' has absolutely nothing to do with our previous definition of retention. In fact, what the metric shows is the inverse of churn (100 percent – your 'rolling retention' stat = your churn).
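To make the gap concrete, here is a sketch of the rolling metric under the definitions given above - a user counts as "retained" if they were active on day N *or any later day*. The data shapes are the same hypothetical ones as before, not Flurry's actual API:

```python
# Sketch of Flurry-style 'rolling retention': the share of a cohort
# active on day N or on any day after day N.
from datetime import date, timedelta

def rolling_retention(installs, sessions, n):
    """installs: dict of user_id -> install date.
    sessions: set of (user_id, date) pairs marking days a user was active.
    Returns the percentage of the cohort active on day N or later."""
    cohort = list(installs)
    if not cohort:
        return 0.0
    retained = sum(
        1 for user in cohort
        if any(d >= installs[user] + timedelta(days=n)
               for (u, d) in sessions if u == user)
    )
    return 100.0 * retained / len(cohort)

d0 = date(2014, 2, 1)
installs = {u: d0 for u in ("a", "b", "c", "d")}
# Only 'a' is active on exactly day 7; 'b' and 'c' drift back weeks later.
sessions = {
    ("a", d0 + timedelta(days=7)),
    ("b", d0 + timedelta(days=20)),
    ("c", d0 + timedelta(days=45)),
}
print(rolling_retention(installs, sessions, 7))  # 75.0
```

Day 7 retention proper would count only user 'a' (25 percent), yet rolling retention reports 75 percent - and its complement, 25 percent, is simply the churn: the share of users who never came back after day 7.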

The real issue

Churn is defined as how many users have quit your game "forever".

Usually – though not always - developers and publishers quote Flurry's Rolling Retention metric as 'Day N Retention', which it isn't. Now, for a publisher or a developer of mobile games, the metric becomes especially problematic for a few reasons:

  • The metric will always be higher for a game (or an app for that matter) that has been on the market for a long time.
  • The longer an app is installed on a user's device, the more likely the user is to open it again. When the user does open it again, he or she is put into the pool of retained users across the full time period - I've seen slightly older titles that show a 90 percent day 30 retention using Flurry's 'rolling retention'. If only that were true
  • The metric gives very limited information on the true quality of the game or app, or of its user base
  • Indie developers are unaware of what the "rolling retention" metric means, and will make uninformed decisions based on it. Alternatively, they waste development cycles adding other analytics toolkits once they finally realise that Flurry does not show the metric they are looking for
  • Venture Capitalists and Product Fund Managers are unaware of what the metric means, and might make uninformed investment decisions based on it. (VC's - please do take note of this!)
  • Publishers are time and again shown excellent retention metrics for developers' apps and games, which are completely useless for informed decision making

Long story short, the "rolling retention" metric Flurry pushes is not something that gives any valuable information to its observer.

In fact, the only thing the metric is good for is to display to uninformed venture capitalists when fishing for an investment, along with other low-value metrics such as "total number of downloads", for example.

Having endured this same conversation with countless developers, explaining what the problem with Flurry's take on retention is – and knowing that we at Playground Publishing aren't the only ones to take issue with it – I can only hope Flurry quickly shifts its approach to retention for the good of all its users.

UPDATE: Flurry has publicly responded promising to provide 'static retention' (or 'Day N retention') within the next few weeks.

Wilhelm Taht is COO at Playground Publishing, having worked within the fields of online, mobile and social media for the last decade. He also serves as an adviser to several startups in the fields of games, media and music. You can follow Wilhelm on Twitter here.

PocketGamer.biz regularly posts content from a variety of guest writers across the games industry. These encompass a wide range of topics and people from different backgrounds and diversities, sharing their opinion on the hottest trending topics, undiscovered gems and what the future of the business holds.


Simon Newstead
@Wilhelm - great they finally added static cohort retention as well as rolling retention.

In our games we have the internal goal to exceed 35% for second day, 25% for third day and 10% for 7th day. We use a combination of Flurry and our own MongoDB for cohort analysis (we were using Apsalar but it got a bit buggy).

In addition to quantitative we also run a bunch of qualitative in-game surveys (polljoy) to see how users are feeling and what's working and not for them.

Only when the numbers look good and the users are telling us they love the game would we go forward to launch. This can take many months of iteration though - expectations have risen a lot in the last 2 years, which is a sign games are getting much better.
Wilhelm Taht
@Babar -- that is an excellent tip. Thanks for sharing!

@Brad -- yes, churn is an important metric. But imho it is of very little relevance to a freemium game (or app for that matter). Churn is especially relevant for services or products that monetize through a subscription, or when looking holistically at industries. I used to talk about "consumer churn from mobile games" when dealing with the dumbphone mobile carrier games business. That churn was huge and ugly.

I think the issue here is how Flurry labels the stat, it causes confusion. If the intention is to show churn (or the inverse of churn), then show churn and call it churn.
Babar Ahmed CEO at Mindstorm Studios
Excellent point Wilhelm. I actually gave up using Flurry's Rolling Retention a year ago when I noticed the problem. Interestingly, though, there's a very accurate metric that Flurry DOES provide for retention, but it's not labeled meaningfully in my opinion. The metric I'm referring to is their "Recent Users Percentage". This shows you the % of users who were new on Day N that are still active within the most recent week, and that's just perfect.

So, for example, if today is Feb 2, I can get a fairly accurate Day 30 retention by looking at the number around Dec 30th or so. I've found the retention to match accurately with the cohorted retention measurement that I'm cross-checking with other analytics SDKs.

Hope that helps folk wanting to use Flurry to get accurate Day N retention numbers.
Brad Jones Director, Product Management at Flurry
Wilhelm - Thank you for bringing up an important topic. Rolling Retention is very useful when you want to understand your app's churn – which, as you correctly note, is the inverse of Flurry's Rolling Retention. While it is never our intention to mislead or confuse developers, we agree that specific-day retention is a very important metric that is currently missing from Flurry Analytics. We have been privately beta testing a solution that we'll roll out to more customers very soon. Stay tuned.

Brad Jones, Director of Product Management, Flurry Analytics
Wilhelm Taht
I don't think you can even estimate it... The only times you can take a snapshot / estimate Day N retention, is on Day N.

I.e. you launch the game on Day X. You acquire on that same day 10,000 users. Then, on Day X + 7, you can see the Day 7 retention. But on Day 8 you cannot see it anymore in Flurry. "There it is, there it is.. it's gone".

I didn't mention this specifically indeed. It's directly a cause of how the stat is calculated within Flurry.
Julian Runge
"Flurry's definition for Rolling Retention is the share of users that came back to your app on the Nth day after first using the app, or any day after the Nth day."

I agree with your rejection of Flurry's approach to measuring retention. There is one more and maybe the main concern when you optimize data-driven: You cannot calculate Flurry's retention metric on day N+1 with certainty. At best, you can estimate it. Didn't see this mentioned in the article.