Data is used throughout organizations to help make educated decisions, but why does data seem to fall short when it comes to measuring how your product is actually used after the download?
Keith Fenech, Co-Founder of Trackerbird Software Analytics, a V.i. Labs company, will discuss practical tips and tools for measuring end-user application usage, and how you can leverage this data to define and hone your product roadmap.
In this session you will learn:
• Methods for collecting software usage data
• Which actionable metrics you should focus on when measuring application usage
• How this data can help you build better products and convert more users to paying customers
About Keith Fenech
Keith is the Co-Founder of Trackerbird Software Analytics, a V.i. Labs company. Trackerbird provides valuable insight into product runtime and customer usage patterns giving product management the intelligence to make data-driven decisions about their product roadmaps.
Prior to founding Trackerbird, Keith held senior product roles at GFI Software and Yellowbit. Keith has a Masters in Computer Science from the University of Malta.
He also founded and manages the Product Management group on LinkedIn, which has over 81,000 members.
The Science of Software: Developing Data-Driven Product Roadmaps (ProductCamp Boston 2016)
1. The Science of Software
Developing Data-Driven Product Roadmaps
ProductCamp Boston - 09 April 2016
Keith Fenech
VP Software Analytics
www.trackerbird.com
2. Housekeeping
Connect to the "Cambridge" wireless network
Open an Internet browser
You'll be redirected to a logon page
When prompted, use the code: pc0409
Follow us @PCampBoston
Official Hashtag – include in tweets:
#PCampBoston
4. About me
• M.Sc. Computer Science
• Specialized in High-Performance Computing
• 15 years of experience in Software Development
and Product Management
• Co-founder/Manager of LinkedIn Group
Networking Product Management (81K+ members)
• Co-founder of Trackerbird Software Analytics
www.trackerbird.com #PCampBoston #SoftwareAnalytics
5. Product Managers need data
• Strategic PM decisions impact the whole
organization
• Build Roadmaps
• Prioritize Features
• Measure ROI
– product development
– product marketing
6. Where do we get data?
• Mix & Match statistics from various sources
– Download logs
– Call-home logs
– Sales or channel feedback
– Surveys with customers
– Feedback from lost leads
– CRM
Painful + Time-Consuming + Inconclusive!
7. What are companies doing?
• Marketing people rely on web analytics solutions
– Google Analytics, Adobe Analytics, Hubspot, etc.
• Product managers tend to keep their own metrics
or beg engineering to build in-app data collection
• Packaged (downloadable) products tend to
become a black box after download
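The "in-app data collection" that PMs beg engineering for can start surprisingly small. Below is a minimal sketch of a call-home usage tracker in Python, purely for illustration: the class name, payload fields, and flush behavior are all hypothetical assumptions, not Trackerbird's actual SDK.

```python
import json
import time
import uuid

class UsageTracker:
    """Minimal in-app usage tracker sketch (hypothetical API, not a real SDK).

    Events are buffered locally; a real implementation would periodically
    POST the serialized batch to a call-home endpoint over HTTPS.
    """

    def __init__(self, product, version):
        self.product = product
        self.version = version
        self.install_id = str(uuid.uuid4())  # anonymous per-install identifier
        self.events = []

    def track(self, event_name, **properties):
        # Record a single feature-usage event with a timestamp.
        self.events.append({
            "event": event_name,
            "ts": time.time(),
            "props": properties,
        })

    def flush(self):
        # Serialize the buffered batch and clear the buffer; in production
        # you would send this payload and clear only on a successful response.
        payload = json.dumps({
            "product": self.product,
            "version": self.version,
            "install_id": self.install_id,
            "events": self.events,
        })
        self.events = []
        return payload

tracker = UsageTracker("MyApp", "4.2.0")
tracker.track("session_start")
tracker.track("export_pdf", pages=12)
batch = tracker.flush()
```

The point of the sketch is that collection is the easy part; as later slides argue, the hard part is turning these raw batches into reports a PM can act on.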
8. Do you know?
• What happens AFTER
download?
• How do users evaluate
your software?
• Which features are used?
• Where should you focus
your R&D ?
• In-app metrics are key
11. • Across all company sizes
• Across all industries (B2B & B2C)
• 50-75% of companies tried collecting some
form of in-app data from their product
• Most of them are still unhappy with how data
is presented to them
Lack of Actionable data for PM
12. • Primary aim is to get actionable metrics
• Vanity metrics have no business value
– Wow, cool chart, now what?!
• Beware of Noise
– Don’t bury yourself in data you cannot consume
Choosing the right metrics
13. • Download counts
• Installation counts
• Runtime session count and evaluation duration
• Feature usage and user engagement
• Churn/Dropoffs/Uninstalls (trial activity)
• Customer vs Churn Profiles/Trends
Actionable Metrics (examples)
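The metrics above reduce to a handful of funnel ratios. Here is a sketch with made-up counts; only the 1000-downloads-to-50-sales ratio echoes the talk's own example later on, and every other number is purely illustrative.

```python
# Hypothetical funnel counts for one release cycle (illustrative numbers only;
# the 1000-downloads / 50-sales ratio mirrors the talk's own example).
downloads = 1000
installs = 620      # downloads that completed installation
trials = 540        # installs with at least one runtime session
purchases = 50

install_rate = installs / downloads      # did downloaders get through setup?
activation_rate = trials / installs      # did installs ever actually run?
conversion_rate = purchases / trials     # trial-to-paid conversion
trial_churn = 1 - conversion_rate        # trial users who never bought

print(f"install rate:    {install_rate:.1%}")
print(f"activation rate: {activation_rate:.1%}")
print(f"conversion rate: {conversion_rate:.1%}")
print(f"trial churn:     {trial_churn:.1%}")
```

Each ratio is actionable on its own: a low install rate points at the installer, a low activation rate at first-run experience, and a low conversion rate at the trial itself.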
15. • Churn data showed users dropping off within 1 day.
• Discovered this dropoff was due to the installation wizard…
Case Study – Churn leads to Redesign
16. • Go beyond data collection
• Convert raw metrics into actionable Business Intelligence reports
• Visualization is crucial to help digest data
• Every team member has different requirements
– Tool must provide easy access to data
– Interactive & customizable
– Ability to drill-down & segment data to answer specific questions in real-time
• Ability to adapt & scale with your business needs
– Tool must scale with you as you grow
– Over time your reporting needs will become more advanced, so ensure the
tool can be used to answer more advanced questions
Choosing the right tool
17. • Focus on your core competencies
– Would you build your own web analytics solution?
– Cost of development, infrastructure, maintenance,
time to deploy
• Automated in-app collection is more scalable
& accurate than manual methods (such as
customer surveys & lost leads surveys)
In-app data collection – build vs buy?
18. • Actionable Data
• Visualization
• Interactive Reporting
• Scalable tool
Keys to a data driven product roadmap
Trackerbird Software Analytics was recently acquired by V.i. Labs. Trackerbird was built specifically to meet the needs of Product Managers: it helps PMs track how users engage with their software product.
Just like any part of an organization, we Product Managers need data.
PMs are expected to make strategic decisions that affect the whole company
If a PM decides to add a new feature to the Roadmap:
Engineering will need to allocate dev teams to build it
Marketing will spend money to promote it
Sales will have to push it out to the public
Support will have to support customers using it
IDEALLY such strategic decisions should be backed up with FACTS
In my days as a PM, we used to spend hours sifting through downloads, CRM logs, Salesforce data, survey results, etc….
Data is sparse and not readily available!
Lack of FACTS pushed us to make strategic decisions based on gut feeling!
Answering even the simplest PM questions:
- Which features should we focus on?
- When can we stop supporting an old version/OS?
- Should we design our new UI on 1024px or 1280px?
- Which Marketing campaign yields best conversion rates?
- What is the Churn Rate for a particular edition or marketing campaign?
- What happens AFTER download?
- If you had 1000 DL and 50 sales, what happened to the rest?
- Did they install & trial the product or get stuck during installation stage?
- How long did they spend in Evaluation before buying/dumping the software?
- Is your reseller in Germany converting users faster than those in US?
- Does your trial-to-conversion rate vary by product or by version?
- Maybe you changed the way a Buy Now button looks – did it have any effect on conversions?
- How many times do they see the “Expiry notice” before renewal?
- How can you understand the sales cycle from a customer’s perspective?
Companies are used to using Web Analytics tools (such as Google Analytics) to track every step people take on their website. This lets them optimize marketing and download counts on the website side. However, once the download takes place they are in the dark about what is happening with their software, what people are doing with it, and what leads to conversions or lost leads.
For those of you familiar with the Pragmatic Marketing Framework… In-app data collection hits on various aspects. In particular…. (next slide)
Knowing what happens AFTER users download your software will help you cover:
Win/Loss – What are people doing during evaluation? Why did they churn?
Roadmap/Requirements – How should we prioritize our feature development?
Pricing – What is more valuable to customers and what are they willing to pay for?
Positioning/Use Scenarios – Is your user-base split into different user groups? Does this require different messaging?
Today we will be focusing mainly on the ROADMAP part….
During the past 5 years I have spoken to hundreds of PMs and software development companies ranging from startups to the Fortune 500 (our clients)
- Via surveys, sales calls, 1-to-1 meetings and LinkedIn discussions
I learnt that lack of actionable data is an issue that affects ALL company sizes and ALL industries.
Half of companies have attempted to do something about it and collect some form of metrics
Most of these still believe they lack the right tool to analyse the data in an actionable manner
Vanity metrics will only keep you happy for a day, until you realize you cannot take action on them. Examples: number of runtime sessions or clicks on a button – NO CONTEXT
Let’s say Ver3 = 40,000 & Ver4 = 60,000 runtime sessions
Does it mean more users or shorter runtimes?
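One way to resolve that ambiguity is to normalize raw totals per user. The sketch below uses hypothetical per-version aggregates (the 40,000 vs 60,000 session counts come from the slide's example; the user and duration figures are invented for illustration).

```python
# Hypothetical per-version aggregates. Session counts echo the slide's
# example; unique-user and duration figures are invented for illustration.
versions = {
    "v3": {"sessions": 40_000, "unique_users": 8_000, "total_minutes": 320_000},
    "v4": {"sessions": 60_000, "unique_users": 20_000, "total_minutes": 360_000},
}

for name, v in versions.items():
    # Normalizing turns a vanity total into two comparable engagement metrics.
    sessions_per_user = v["sessions"] / v["unique_users"]
    avg_session_min = v["total_minutes"] / v["sessions"]
    print(f"{name}: {sessions_per_user:.1f} sessions/user, "
          f"{avg_session_min:.1f} min/session")
```

With these made-up numbers, v4's bigger session total hides the fact that each user runs it less often and for shorter periods than v3 — exactly the kind of context a raw count cannot give you.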
Noise: We live in a big data world. We are NOT in a race to fill up the biggest DB.
Collecting too much data can make it hard to manage and consume. Avoid situations where you collect so much data that it gets out of hand and you lack the capacity and resources to consume it, let alone use it to improve your product or business.
Downloads: Only measures user curiosity and effectiveness of your website landing page and CTA.
Installations: How many of your downloaders actually go through with installation, letting you cut from the funnel those users who download but never install.
Runtime session count and Eval duration: How much time do new users need to discover your product, and when is the right time to follow up – leaving enough breathing space but not leaving it too late. Also measures user loyalty and engagement: how hooked are users on your product?
Feature usage and User Engagement: What are the core things that users are trying to do with your product? Are they discovering your killer features? What features are used by Churned users?
Do these complement how you are marketing your product?
Churn/Purchases vs Dropoffs/Uninstalls: Are users making use of their 30-day trial or do they drop off after just 30 minutes of evaluation? Are drop-offs higher in a particular region or for a specific version or edition? Is it price sensitivity which you can fix through a promotion? Do you have a way to collect feedback from these lost leads?
Customer vs Churn Profiles/Trends – Do your customers or churned users match a specific profile?
Blog source: http://blog.trackerbird.com/content/why-measuring-download-statistics-useless/
Raw data by itself is of little use so you must visualize raw data in reports/dashboards – digestible.
A large % of users were lost during the first 2 runs (5 mins) after install. This happened on the same day they installed the software.
After investigating deeper using Event usage activity they discovered most users were quitting in the installation wizard.
RESULT: Company ended up re-designing their post-install user experience
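An early-drop-off analysis like the one in this case study can be sketched in a few lines: given each trial user's session timestamps, flag users whose last session fell within minutes of their first run. All user names and timestamps below are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical session log: session start times per trial user.
sessions = {
    "user_a": [datetime(2016, 3, 1, 10, 0), datetime(2016, 3, 1, 10, 3)],
    "user_b": [datetime(2016, 3, 1, 11, 0), datetime(2016, 3, 5, 9, 0)],
    "user_c": [datetime(2016, 3, 2, 14, 0)],
}

def early_dropoffs(sessions, window=timedelta(minutes=5)):
    """Return users whose last recorded session fell within `window`
    of their very first run, i.e. likely install-time drop-offs."""
    dropped = []
    for user, times in sessions.items():
        first, last = min(times), max(times)
        if last - first <= window:
            dropped.append(user)
    return sorted(dropped)

print(early_dropoffs(sessions))  # → ['user_a', 'user_c']
```

Widening the `window` (e.g. to a full day, as in the case study) turns the same function into a first-day churn report, which is the signal that prompted the redesign above.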
From numbers to insight
Once you have identified the metrics you want to collect, you should find or build a solution that allows you to NOT ONLY collect the data, but also present it efficiently so you can use it to make actionable decisions.
This usually means that the reporting framework you choose must be flexible enough to allow filtering, segmentation, customization so you can extract the right answers.
Source: http://blog.trackerbird.com/content/why-measuring-download-statistics-useless/
This will help you avoid situations where you collect way too much data that gets out of hand and you lack the capacity and resources to consume most of it, let alone use it to improve your product or business.
A substantial number of companies said that product managers and decision makers within their company would very rarely get all the answers they wanted from their home-built system, mostly due to limitations or lack of flexibility in the reporting framework rather than in the data being collected.
So don’t be fooled by wearing your developer’s hat and looking at how easy it might seem to collect the data. It’s useless to dump a bunch of data in a database unless you have a method to easily and efficiently extract DIGESTIBLE reports from it.
Having big data inside a database does not give you any answers. You need a usable and flexible reporting framework that can scale with your product. Building and maintaining an infrastructure that is both flexible and easily usable by the non-geeks in your company can become a long-term nightmare for your developers. Ask yourself how easy it would be for a salesperson or product manager to build new reports and extract ACTIONABLE answers from the system without involving the developers. If it becomes too much hassle to extract the right reports, over time nobody will use it, and you will have lost all the development effort, which could have been better spent on your product.
If you consider investing in a custom-built call-home system, keep in mind that your own physical or virtual server will need administration time on top of its general running cost. This is why a hosted 3rd-party solution might be faster and cheaper to set up.
- See more at: http://blog.trackerbird.com/content/how-much-effort-should-a-startup-invest-into-analytics-and-measurement