In the mobile ecosystem, we define a cohort as a group of users who completed a particular action for the first time within a specific time frame. Do you remember the year you graduated from high school? If it was 1997, then you are in the cohort class of '97. How are your '97 classmates doing today? Developers can ask the same question of a cohort of users who, for example, launched your app for the first time during the week of November 13: how are they doing now? More specifically, how many are still active? How many have reached a certain level in your app? How many made an in-app purchase? This data becomes particularly revealing if an app update went live just prior to November 13. By comparing the cohorts that came after the update to the cohorts that came before, and seeing how long each set of users stays active, you can determine whether the update had a positive or negative impact on retention.
App publishers should be continuously measuring their apps against three main criteria:
- Improved Retention: Are users more likely to come back to the app?
- Better Engagement: Are users more “active” within the app?
- Increased Monetization: Are users spending more money?
Over the next few posts, we’ll uncover how a cohort analysis can be used for each, but in this post, we’ll focus on Retention. You can measure Retention by examining how long your users stay active (according to your own definition)—whether that be daily active users (DAUs), as is most common in the mobile games industry, or monthly active users (MAUs).
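As a rough illustration, here is a minimal sketch of how you might count daily and monthly active users from a raw event log. It assumes a pandas DataFrame named `events` with `user_id` and `timestamp` columns; the names and sample data are hypothetical, not part of any particular analytics product.

```python
import pandas as pd

# Hypothetical event log: one row per tracked event, with a user ID and a timestamp.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 3],
    "timestamp": pd.to_datetime([
        "2013-01-05", "2013-01-06", "2013-01-05",
        "2013-01-05", "2013-01-20", "2013-02-02",
    ]),
})

# DAU: distinct users seen on each calendar day.
dau = events.groupby(events["timestamp"].dt.date)["user_id"].nunique()

# MAU: distinct users seen in each calendar month.
mau = events.groupby(events["timestamp"].dt.to_period("M"))["user_id"].nunique()

print(dau)
print(mau)
```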
A cohort-based retention analysis consists of two components, both based on events you are tracking within the app. The first is the cohorting event, which determines how users are grouped together, and the second is the retention indicator, which tells us whether the changes to the app were effective. Here is the breakdown for the simplest example, first-use retention (a code sketch follows the list):
- Cohorting event
- Event: “Launched App” (for the first time, by week)
- This lets you separate users who launched the app for the first time in the weeks prior to your app update from those who first launched it afterward.
- Retention indicator event
- Event: “Relaunched App”
- Because there is no overlap in the different cohorts, we’ll be able to see whether the update increased retention by comparing the different weeks.
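As a concrete sketch of these two components, here is how you might assign users to weekly cohorts from a raw event log using Python and pandas. The event names match the breakdown above; the column names and sample data are hypothetical, and any real analytics tool would do this grouping for you.

```python
import pandas as pd

# Hypothetical event log: one row per tracked event (names and data are illustrative).
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "event": ["Launched App", "Relaunched App", "Relaunched App",
              "Launched App", "Relaunched App", "Launched App"],
    "timestamp": pd.to_datetime([
        "2012-12-14", "2012-12-21", "2013-01-04",
        "2012-12-28", "2013-01-02", "2013-01-05",
    ]),
})

# Cohorting event: each user's first "Launched App", bucketed into a weekly cohort.
first_launch = (events[events["event"] == "Launched App"]
                .groupby("user_id")["timestamp"].min())
cohort_week = first_launch.dt.to_period("W").rename("cohort_week")

# Retention indicator: "Relaunched App" events, bucketed by week and joined to each
# user's cohort so we can later count who came back and when.
relaunches = events[events["event"] == "Relaunched App"].copy()
relaunches["activity_week"] = relaunches["timestamp"].dt.to_period("W")
relaunches = relaunches.join(cohort_week, on="user_id")
```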
In the screenshot below, you see the resulting analysis in a heat-mapped report. You can quickly identify what percentage of each cohort is coming back to the app, week after week. The rows represent the different cohorts of new users in each week. The columns represent the number of weeks after the first cohort event (“Launched App”). Again, because there is no overlap between cohorts, you can examine each row to see how well users in each separate cohort were retained.
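Continuing the sketch above, the heat-mapped report is essentially a pivot table: rows are cohort weeks, columns are weeks elapsed since the cohorting event, and each cell is the percentage of that cohort that relaunched the app in that week.

```python
# Weeks elapsed between the cohorting event and each relaunch.
relaunches["weeks_since"] = (
    relaunches["activity_week"] - relaunches["cohort_week"]
).apply(lambda offset: offset.n)

# Distinct returning users per (cohort week, weeks since first launch).
returning = (relaunches.groupby(["cohort_week", "weeks_since"])["user_id"]
             .nunique()
             .unstack(fill_value=0))

# Divide by cohort size to get the retention percentages shown in the heat map.
cohort_size = cohort_week.value_counts().sort_index()
returning = returning.reindex(cohort_size.index, fill_value=0)
retention = returning.div(cohort_size, axis=0) * 100
print(retention.round(1))
```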
The update to the app went live on December 26. So the two weekly cohorts (weeks starting December 13 and December 20) represent users that launched the app for the first time with Version 1.0. The weekly cohorts following the update (weeks starting December 27, January 3, etc.) represent users that launched the app for the first time with Version 2.0.
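If you want to tag each cohort row with the app version it first saw, a simple comparison against the release date is enough. This continues the sketch above; the year is assumed, since the post only gives the day of the release.

```python
# Hypothetical release date: the post gives December 26, the year is assumed.
release_date = pd.Timestamp("2012-12-26")

# Cohorts whose week began before the release first saw Version 1.0; later cohorts saw 2.0.
app_version = retention.index.map(
    lambda week: "1.0" if week.start_time < release_date else "2.0")
retention.insert(0, "app_version", app_version)
```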
Well, did the update create greater retention? Let’s look closer...
If we look at column one (1), the first two cohorts show good retention (as indicated by the orange on the heat map, or by the percentages in the second report), but by the third cohort row, the map turns bright red, indicating 100% retention. This means that post-update, all of the new users came back to the app during their first week. You can also see a higher level of retention in columns two (2) and three (3). Congrats!
Going a bit deeper on cohort retention analysis, you may want to measure retention for cohorts beyond new users/first use. For example, let’s take a game publisher who sets the cohorting event to “reach level 5,” with the retention indicator as “launch app.” This helps that publisher determine how long users within that cohort (namely, those who reached level 5) remain active. If those gamers come back for seven days after the cohort event, while users in general only stay active in the game for five days, then the game publisher has valuable information indeed: users who reach level 5 are more engaged, and thus their retention is better. Now the game publisher can experiment with getting more users to reach level 5 in the game, perhaps by providing extra incentives, or by simply making it easier to get there.
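As a sketch of that deeper analysis, here is one way to compare how long the level-5 cohort stays active against the overall average. The data and column names are hypothetical; a real analysis would pull these dates from your tracked events.

```python
import pandas as pd

# Hypothetical per-user summary: first launch, the date the user reached level 5
# (if ever), and the date of their most recent "launch app" event.
users = pd.DataFrame({
    "user_id":        [1, 2, 3, 4],
    "first_launch":   pd.to_datetime(["2013-01-02", "2013-01-03", "2013-01-04", "2013-01-05"]),
    "reached_level5": pd.to_datetime(["2013-01-04", None, "2013-01-06", None]),
    "last_launch":    pd.to_datetime(["2013-01-11", "2013-01-06", "2013-01-13", "2013-01-07"]),
})

# Overall: how many days users keep launching the app after their first launch.
overall_days = (users["last_launch"] - users["first_launch"]).dt.days.mean()

# Level-5 cohort: how many days users keep launching the app after the cohorting event.
level5 = users.dropna(subset=["reached_level5"])
level5_days = (level5["last_launch"] - level5["reached_level5"]).dt.days.mean()

print(f"Average active days, all users: {overall_days:.1f}")
print(f"Average active days after reaching level 5: {level5_days:.1f}")
```

If the level-5 number comes out meaningfully higher, that supports the idea that reaching level 5 is tied to better retention, and experiments that push more users to level 5 are worth running.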
This begins the cycle of continuous improvement: use cohort analyses to understand user behavior and engagement within your apps, make changes, and measure whether retention improves.