11 Metrics Every PM Should Know
1 Product Stickiness (DAU/MAU, WAU/MAU)
The Daily Active Users (DAU) to Monthly Active Users (MAU) ratio measures the stickiness of your product - that is, how often people engage with your product.
2 Product Usage: The Highs and Lows
With any feature set, you will have your usual suspects in terms of "most used features." And there will always be features whose usage is higher (or lower) than what you've assumed.
Top 10 Features

Rank  Feature                    # of Events  BDF Score
1     Pages                      45,559       42.1
2     Segment Dropdown           199,232      42.0
3     Visitors                   39,172       41.7
4     Behavior                   23,384       41.0
5     Features                   44,061       40.9
6     Accounts                   38,100       40.3
7     Dashboard                  18,201       39.8
8     Guides (global)            34,435       39.7
9     Visitor List: search       43,750       39.5
10    Guide Center: Badge icon   126,421      39.0
. . .
361   New Ongoing Goal           115          2.7
370   NPS email                  139          2.4
In this example, I want to dig into “New
Ongoing Goal” and “NPS email.”
3 Typical Adoption of New Features
The first step in setting goals for feature adoption is understanding adoption of your most recent feature launches.
4 Product and Feature Retention / Churn Rate
Do your customers consistently use your features? Which parts of your product do people come back to, and which areas are one-and-done?
Cohort view of PMs and CSMs using "Trends"
5 Conversion Rate
Be it conversion from free trial to paid, or conversion on upsell, this is an important metric to keep track of, and to move the needle on.
A change that materially impacts the conversion rate of the product
6a Account-level NPS
Unique to B2B software, analyzing NPS at the account level can offer a tremendous amount of insight.
This section accounts for more than 60% of our ARR - how will we understand sentiment for these accounts?
6b User-level NPS
For user-level NPS, I want to know the overall scores and the scores by persona.
7 Leading Indicators of Retention and Expansion
Leading indicators are powerful. For example, knowing the early signals of an unhealthy account can lead to early actions - actions that can save the account.

Visitors
Visitors are likely to return to Pendo often if they:
• Visit at least 3 times in their first 10 days
• Use the Account Analytics feature group within their first 30 days at Pendo

Accounts
Accounts are likely to renew and expand if during their first 90 days they:
• Used the Account Analytics feature group for at least 6.3 min/day on average
• Had at least 6 unique visitors
• Spent at least 17 min/day on site on average
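As a rough illustration, the account thresholds above can be turned into a simple health check. A minimal Python sketch, assuming a per-account activity summary is already available as a dict (all field names are hypothetical):

```python
# Sketch: flagging accounts as healthy using the first-90-days thresholds above.
# The dict keys are hypothetical; real data would come from your analytics tool.

def account_is_healthy(first_90_days: dict) -> bool:
    """Apply the renewal/expansion leading indicators from the first 90 days."""
    return (
        first_90_days["account_analytics_min_per_day"] >= 6.3
        and first_90_days["unique_visitors"] >= 6
        and first_90_days["avg_min_on_site_per_day"] >= 17
    )

print(account_is_healthy({
    "account_analytics_min_per_day": 7.2,
    "unique_visitors": 8,
    "avg_min_on_site_per_day": 21,
}))  # True - this account clears all three thresholds
```

In practice the thresholds would come out of the regression analysis described in the notes, not be hand-picked.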
8 Top Feature Requests
Often overlooked as an analytic, top feature requests should be part of a PM's standard set of metrics, reviewed on a regular basis.
9 Performance
App performance is often thought of as an engineering-only metric. This is a mistake: the product team needs to know it inside out as well.
10 Bugs Reported vs Solved
For some applications, all bugs
must be squashed. For others,
bugs are acceptable. But PMs
must know where they stand, and
how much time to dedicate to this
Sisyphean task.
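One simple way to see where you stand is the running open-bug backlog implied by reported vs. solved counts. A sketch with illustrative weekly data:

```python
from itertools import accumulate

# Sketch: open-bug backlog from weekly reported vs. solved counts (illustrative).
reported = [12, 9, 15, 7]
solved = [8, 10, 11, 9]

net = [r - s for r, s in zip(reported, solved)]  # weekly change in open bugs
backlog = list(accumulate(net))                  # open bugs at end of each week
print(backlog)  # [4, 3, 7, 5]
```

A flat or falling backlog suggests the current time allocation is sustainable; a rising one means the Sisyphean boulder is winning.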
11 Delivery Forecastability
The rest of the business relies on product and engineering getting features and changes out the door. Predictability is often more important than velocity.
Honorable Mention:
Customer Lifetime Value (CLV)
Clearly core to a SaaS business as a whole, and sometimes important for the PM to have a solid grasp of.
www.productschool.com
Part-time Product Management, Coding, Data Analytics, Digital Marketing, UX Design and Product Leadership courses in San Francisco, Silicon Valley, New York, Santa Monica, Los Angeles, Austin, Boston, Boulder, Chicago, Denver, Orange County, Seattle, Bellevue, Washington DC, Toronto, London and Online
Editor's Notes
Everyone has a bent, a history that taints, or perhaps “flavors” their views. Here is mine.
Google - very large data sets. B2C focus. Care about the end user, but generally not about revenue.
Web analytics are imprecise, but they should paint a picture to give you understanding.
The horrors that are out there:
• Analytics blockers filtering out data
• Bots artificially inflating data
• People running Internet Explorer 5 on a Nintendo Game Boy
Different stages of a company or product or project may call for different analysis.
B2C vs B2B vs B2G
Early stage, it’s often more about understanding an individual’s behaviors and motives, while later stage it’s about understanding groups and segments of groups.
Product intersects with most of the rest of the business, and you will see that analytics does the same. PMs are not the sole owners of some of these metrics, but they need to know about them.
Why analytics?
Understanding. Conversation. And ultimately, to change behavior.
The Daily Active Users (DAU) to Monthly Active Users (MAU) Ratio measures the stickiness of your product - that is, how often people engage with your product.
DAU is the number of unique users who engage with your product in a one-day window. MAU is the number of unique users who engage with your product over a 30-day window (usually a rolling 30 days).
The ratio of DAU to MAU is the proportion of monthly active users who engage with your product in a one-day window.
I want to be sure I have a narrative--or a plan--especially around those features that have surprisingly low engagement or BDF.
The first step with any insight is to have the context--to have the data to point you in the right direction of exploration.
This view can also help you decide if there is feature/functionality that you should sunset/delete. Something that can be just as valuable as launching new features.
In this case, we see an initial adoption of two major feature launches around 45 and 50% respectively, with around a 10% drop after the initial marketing campaigns.
With feature launches, I like to first understand adoption at the account level because it captures users in the account that are interested and doesn’t discount for users that don’t need this feature.
Second, I look at adoption at the user level to better understand the adoption of my target persona.
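Those two views can be computed side by side. A hedged sketch with hypothetical account and usage data (account names and structure are illustrative, not Pendo's actual schema):

```python
# Sketch: new-feature adoption at the account level vs. the user level.
# `accounts` maps each account to its users; `used_feature` maps each account
# to the subset of users who tried the new feature. All names are illustrative.

accounts = {"acme": {"a1", "a2", "a3"}, "globex": {"g1", "g2"}}
used_feature = {"acme": {"a1"}, "globex": set()}

# Account level: any single interested user counts the whole account as adopting.
adopting_accounts = sum(1 for users in used_feature.values() if users)
account_adoption = adopting_accounts / len(accounts)

# User level: what share of all users tried the feature.
total_users = sum(len(u) for u in accounts.values())
adopting_users = sum(len(u) for u in used_feature.values())
user_adoption = adopting_users / total_users

print(account_adoption, user_adoption)  # 0.5 0.2
```

The gap between the two numbers is itself informative: a high account rate with a low user rate matches the note's point that not every user in an account needs the feature.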
Understanding retention at the feature level will help drive more specific actions in order to increase engagement.
This metric helps you answer the question: “Which customers should I talk to in order to better understand the value (or lack of value) of a particular feature?”
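A simple cohort-retention calculation along these lines might look like the following (illustrative data, not the actual cohort view from the slide):

```python
# Sketch: weekly feature retention for a cohort of first-time users.
# `first_used` marks the week each user first used the feature; `active_weeks`
# lists the weeks they used it. All data is illustrative.

first_used = {"u1": 0, "u2": 0, "u3": 0}
active_weeks = {"u1": {0, 1, 2}, "u2": {0, 1}, "u3": {0}}

def retention(week: int) -> float:
    cohort = [u for u, w in first_used.items() if w == 0]   # week-0 cohort
    retained = [u for u in cohort if week in active_weeks[u]]
    return len(retained) / len(cohort)

print([retention(w) for w in range(3)])  # retention at weeks 0, 1, 2
```

Users who fall out of a given week's row are exactly the “one and done” candidates worth interviewing.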
While not applicable to all businesses, this metric can be instrumental to the success of a company or product, and therefore needs to be top of mind for PMs.
The goal here of course, is to increase this metric over time through improved functionality, improved usability, and improved experience in general.
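The computation itself is trivial; the work is in moving the number. A sketch with illustrative counts:

```python
# Sketch: free-trial-to-paid conversion rate. Counts are illustrative.
trials_started = 480
trials_converted_to_paid = 72

conversion_rate = trials_converted_to_paid / trials_started
print(f"{conversion_rate:.1%}")  # 15.0%
```

The same arithmetic applies to any funnel step, e.g. upsell offers shown vs. accepted.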
When combining product usage and ARR size, this scatterplot quickly highlights accounts that may be at risk. It should prompt immediate action if not yet accounted for.
It’s also important to know and understand those accounts that have not provided us NPS. It prompts the need to find alternative ways of understanding sentiment--especially for the larger accounts.
Ultimately, you are building for a specific persona, so it’s likely that persona’s score is higher than the overall score.
Reading each verbatim and pulling out themes is also helpful when looking to drive improvements in the score.
Remember, NPS measures the likelihood to recommend. Word of mouth is number one - there is no number two.
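The standard NPS formula - percent promoters (scores of 9–10) minus percent detractors (0–6) - can be sketched as follows (survey responses are illustrative):

```python
# Sketch: standard NPS scoring. Promoters score 9-10, detractors 0-6,
# passives 7-8 are counted in the denominator only. Scores are illustrative.

scores = [10, 9, 9, 8, 7, 6, 5, 10, 3, 9]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)
nps = 100 * (promoters - detractors) / len(scores)
print(nps)  # 20.0
```

Segmenting `scores` by account or persona before applying the same formula gives the account-level and persona-level views discussed above.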
Here, we have the outputs of some regression testing to highlight correlation between activity during the first 90 days and retention/expansion.
PMs need to better understand key drivers of the business: why we win/lose, and what drives attrition/renewal/expansion.
As we all know, just because something is being requested by someone doesn’t mean we must build it, but it’s naive not to have the data in front of you at all times.
What is 5th percentile performance for your app? What about 95th percentile? Which parts of the app are slow? What product trade-offs could you make to improve performance?
The graph here is shown at an individual sprint level, but you need to prove to the company as a whole that you can not only hit a sprint’s commit, but can hit longer term project commits. B2C sometimes gets a pass on this, but not always.
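One simple way to quantify predictability is the share of sprints where delivery met the commit. A sketch with illustrative sprint data:

```python
# Sketch: delivery predictability as the fraction of sprints where the team
# delivered at least what it committed. Story points are illustrative.

sprints = [(20, 21), (18, 18), (22, 15), (20, 20), (19, 23)]  # (committed, delivered)

hit_rate = sum(1 for committed, delivered in sprints if delivered >= committed) / len(sprints)
print(f"{hit_rate:.0%}")  # 80%
```

The same calculation over quarterly or project-level commits gives the longer-term number the rest of the business actually cares about.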
The higher the $ a customer pays, and the longer they stay, the higher the CLV. This has massive effects on the business as a whole for SaaS businesses. If the business is spending $10k on marketing and sales to get a customer in the door who is paying $1k a month, and they churn after a couple of months, this is clearly a problem. PMs are responsible for making sure the customer is getting their money’s worth.
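The note's example numbers make the problem concrete. A quick sanity check:

```python
# Sketch: simple CLV vs. acquisition cost, using the note's example numbers:
# $10k spent to acquire a customer paying $1k/month who churns after 2 months.

cac = 10_000            # marketing + sales spend to acquire the customer
monthly_revenue = 1_000
months_retained = 2

clv = monthly_revenue * months_retained
print(clv - cac)  # -8000: the business loses money on this customer
```

Real CLV models also discount for gross margin and model churn as a rate rather than a fixed tenure, but even this back-of-the-envelope version shows why retention is the PM's problem.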