What This Template Is For
Shipping a feature is the easy part. Knowing whether anyone actually uses it is harder. Most teams ship, celebrate, and move on. They check a single usage number a week later, declare victory or defeat, and never look deeper.
That shallow approach misses critical questions. Which user segments adopted the feature? How deep does usage go beyond the first click? How long after release do users discover it? Is adoption growing, plateauing, or declining over time? Without answers, you cannot distinguish a feature that needs better onboarding from one that solves a problem nobody has.
This template helps you define what "adopted" means for a specific feature, choose the right metrics and dimensions, specify dashboard visualizations, and set adoption targets. It pairs with the Product Analytics Handbook for measurement methodology and the activation rate glossary entry for related benchmarks.
Use the feature adoption rate metric to calibrate your targets. If you need to decide which features to invest in next, the RICE Calculator helps you prioritize based on reach and impact.
When to Use This Template
- After shipping a new feature. Build the dashboard before launch so you can track adoption from day one.
- When stakeholders ask "is it working?" Replace anecdotes with a shared dashboard that answers the question with data.
- During quarterly planning. Use adoption data to decide which features need investment, iteration, or deprecation.
- When running a rollout or beta. Track adoption across rollout stages to catch problems before a full release.
- When comparing cohorts. Understand whether new users adopt a feature faster than existing users who need to change habits.
How to Use This Template
Step 1: Define the Adoption Event
Pick the specific action that constitutes "adopted." A single page view is awareness, not adoption. Adoption means the user completed the core action the feature enables. Be precise. "Used the reporting feature" is vague. "Generated and exported a report" is measurable.
Step 2: Identify Supporting Metrics
Beyond the primary adoption event, define metrics for discovery (saw the feature), trial (tried it once), repeated use (used it 3+ times), and depth (used advanced capabilities). These layers tell you where users drop off.
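These layers can be computed directly from an event log. Below is a minimal sketch assuming a flat log of `(user_id, event_name)` pairs; the event names (`feature_viewed`, `feature_tried`, `feature_used`, `advanced_used`) and the sample data are hypothetical placeholders for your own instrumentation.

```python
from collections import Counter

# Hypothetical event log exported from your analytics tool.
events = [
    ("u1", "feature_viewed"), ("u1", "feature_tried"), ("u1", "feature_used"),
    ("u1", "feature_used"), ("u1", "feature_used"), ("u1", "advanced_used"),
    ("u2", "feature_viewed"), ("u2", "feature_tried"),
    ("u3", "feature_viewed"),
]
eligible = {"u1", "u2", "u3", "u4"}  # users with access to the feature

def users_with(name):
    """Set of distinct users who fired a given event at least once."""
    return {u for u, e in events if e == name}

discovered = users_with("feature_viewed") & eligible
tried = users_with("feature_tried") & discovered
adopted = users_with("feature_used") & eligible
use_counts = Counter(u for u, e in events if e == "feature_used")
repeated = {u for u in adopted if use_counts[u] >= 3}  # 3+ uses = repeated
deep = users_with("advanced_used") & adopted

funnel = {
    "discovery_rate": len(discovered) / len(eligible),
    "trial_rate": len(tried) / len(discovered),
    "adoption_rate": len(adopted) / len(eligible),
    "repeat_rate": len(repeated) / len(adopted),
    "depth_rate": len(deep) / len(adopted),
}
```

Note that each ratio uses a different denominator: discovery and adoption divide by eligible users, while trial, repeat, and depth divide by the prior funnel stage, matching the definitions in the metrics table below.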
Step 3: Choose Dimensions
Decide how to slice the data. Common dimensions: user segment, plan tier, acquisition cohort, role, company size, and platform. Start with 3-4 dimensions that map to decisions you would actually make.
Step 4: Specify Visualizations
For each metric, define the chart type, time range, and comparison baseline. Line charts for trends, bar charts for segment comparisons, tables for detailed breakdowns.
Step 5: Set Targets and Alerts
Set 30-day and 90-day adoption targets. Configure alerts that fire when the adoption rate drops below a threshold or growth flattens before reaching the target.
The Template
Feature Adoption Definition
| Field | Value |
|---|---|
| Feature name | [Feature name] |
| Ship date | [YYYY-MM-DD] |
| Target users | [Who this feature is built for] |
| Eligible user count | [N users who have access] |
| Adoption event | [Specific action that = adopted] |
| Discovery event | [How users find the feature] |
| Trial event | [First meaningful interaction] |
| Repeat threshold | [N uses within X days = repeated] |
| Depth event | [Advanced/power usage action] |
Adoption Funnel Metrics
| Metric | Definition | Calculation | Target (30d) | Target (90d) |
|---|---|---|---|---|
| Discovery rate | % of eligible users who saw the feature | Unique users with discovery event / eligible users | [%] | [%] |
| Trial rate | % of discoverers who tried it once | Unique users with trial event / discoverers | [%] | [%] |
| Adoption rate | % of eligible users who completed adoption event | Unique adopters / eligible users | [%] | [%] |
| Repeat rate | % of adopters who used it N+ times in X days | Repeat users / adopters | [%] | [%] |
| Depth rate | % of adopters who used advanced capabilities | Deep users / adopters | [%] | [%] |
| Time to adopt | Median days from eligibility to adoption event | Median(adoption_date - eligible_date) | [N days] | [N days] |
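The time-to-adopt row uses a median rather than a mean so a few very slow adopters do not distort the number. A minimal sketch, assuming hypothetical per-user eligibility and adoption dates pulled from your warehouse:

```python
from datetime import date
from statistics import median

# Hypothetical dates; u2 never adopted and so is excluded from the metric.
eligible_date = {"u1": date(2026, 2, 10), "u2": date(2026, 2, 10), "u3": date(2026, 2, 14)}
adoption_date = {"u1": date(2026, 2, 13), "u3": date(2026, 2, 24)}

# Only adopters contribute; non-adopters have no adoption date to measure.
days_to_adopt = [(adoption_date[u] - eligible_date[u]).days for u in adoption_date]
time_to_adopt = median(days_to_adopt)  # robust to a long tail of slow adopters
```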
Dashboard Specifications
| Panel | Chart Type | Metric | Time Range | Comparison | Dimensions |
|---|---|---|---|---|---|
| Adoption trend | Line chart | Cumulative adoption rate | Launch to now, daily | Target line overlay | None (aggregate) |
| Funnel | Horizontal bar | Discovery > Trial > Adopt > Repeat > Depth | Rolling 30 days | Prior 30 days | None (aggregate) |
| Adoption by segment | Stacked bar | Adoption rate | Rolling 30 days | None | [Segment 1] |
| Adoption by plan | Grouped bar | Adoption rate | Rolling 30 days | None | Plan tier |
| Time to adopt | Histogram | Days to adoption | Since launch | None | None |
| Adoption by cohort | Heatmap | Adoption rate | Last 8 cohorts | Color intensity | Weekly cohort |
| Usage frequency | Bar | Sessions per adopter per week | Rolling 4 weeks | None | None |
| Depth breakdown | Pie/donut | % of adopters at each depth level | Rolling 30 days | None | None |
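The "Adoption trend" panel plots a cumulative adoption rate per day. The series behind that chart is simple to derive from daily first-time-adopter counts; the launch date, counts, and eligible-user total below are illustrative assumptions.

```python
from datetime import date, timedelta
from itertools import accumulate

launch = date(2026, 2, 10)
new_adopters = [40, 55, 30, 25, 20, 15, 10]  # first-time adopters per day (hypothetical)
eligible = 1000

# Running total of adopters, converted to a rate: one (date, rate) point per day.
cumulative = list(accumulate(new_adopters))
trend = [
    (launch + timedelta(days=i), round(c / eligible, 3))
    for i, c in enumerate(cumulative)
]
```

Feed `trend` to whatever charting layer your dashboard uses, with the 30-day target drawn as a horizontal overlay line.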
Segment Analysis
| Segment | Eligible Users | Discovery Rate | Trial Rate | Adoption Rate | Repeat Rate | Time to Adopt | Notes |
|---|---|---|---|---|---|---|---|
| [Segment A] | [N] | [%] | [%] | [%] | [%] | [N days] | [Notes] |
| [Segment B] | [N] | [%] | [%] | [%] | [%] | [N days] | [Notes] |
| [Segment C] | [N] | [%] | [%] | [%] | [%] | [N days] | [Notes] |
| [Segment D] | [N] | [%] | [%] | [%] | [%] | [N days] | [Notes] |
Highest-performing segment: [Segment] at [%] adoption rate
Lowest-performing segment: [Segment] at [%] adoption rate
Action: [What you will do differently for the underperforming segment]
Alert Configuration
| Alert | Condition | Threshold | Channel | Frequency |
|---|---|---|---|---|
| Low adoption | 7-day rolling adoption rate below target | < [%] | [Slack / email] | Daily |
| Adoption plateau | 7-day growth in cumulative adoption < threshold | < [N] new adopters/day | [Slack / email] | Daily |
| Discovery gap | Discovery rate below expected with rollout complete | < [%] | [Slack / email] | Weekly |
| Repeat decline | Repeat rate drops from prior period | > [X pp] decline | [Slack / email] | Weekly |
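The plateau alert is the one most teams skip and most regret skipping. A minimal sketch of the check, assuming a daily count of new adopters and an illustrative threshold:

```python
# Daily new-adopter counts since launch (hypothetical data).
daily_new_adopters = [42, 38, 30, 22, 15, 9, 6, 4, 3, 2]
PLATEAU_THRESHOLD = 10  # assumed: alert if avg new adopters/day falls below this

def plateau_alert(daily_counts, threshold, window=7):
    """Return True when the trailing-window average drops below threshold."""
    if len(daily_counts) < window:
        return False  # not enough history to judge a plateau yet
    recent = daily_counts[-window:]
    return sum(recent) / window < threshold
```

In production this check would run on a daily schedule and post to the channel named in the table; the scheduling and delivery mechanism depend on your alerting stack.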
Dashboard Checklist
- ☐ Defined primary adoption event (specific, instrumentable action)
- ☐ Defined supporting metrics: discovery, trial, repeat, depth
- ☐ Selected 3-4 analysis dimensions
- ☐ Specified chart types and time ranges for each panel
- ☐ Set 30-day and 90-day adoption targets
- ☐ Configured alerts for adoption decline and plateau
- ☐ Verified all events are instrumented in analytics
- ☐ Reviewed dashboard with engineering to confirm data availability
- ☐ Shared dashboard link with stakeholders
- ☐ Scheduled 30-day adoption review meeting
Filled Example: Project Management Tool (Task Dependencies Feature)
Feature Adoption Definition
| Field | Value |
|---|---|
| Feature name | Task Dependencies |
| Ship date | 2026-02-10 |
| Target users | Team leads and project managers on Team and Enterprise plans |
| Eligible user count | 14,200 users |
| Adoption event | Created a dependency link between two tasks |
| Discovery event | Viewed the dependency tooltip or help article |
| Trial event | Opened the "Add dependency" dialog |
| Repeat threshold | Created 3+ dependencies within 14 days |
| Depth event | Used the critical path view (requires 5+ dependencies) |
Adoption Metrics (30 days post-launch)
| Metric | Value | Target | Status |
|---|---|---|---|
| Discovery rate | 62% | 50% | Above target |
| Trial rate | 41% | 35% | Above target |
| Adoption rate | 28% | 25% | Above target |
| Repeat rate | 54% | 50% | Above target |
| Depth rate | 12% | 20% | Below target |
| Time to adopt | 6 days | 10 days | Ahead of target |
Key Finding
Discovery and trial rates exceeded targets. Users found the feature and tried it. But depth rate (12% vs. 20% target) signals that adopters are not progressing to the critical path view. Session recordings reveal that users did not realize the critical path view existed. It requires 5+ dependencies, and most adopters created 3-4. The onboarding tooltip mentions dependencies but not the critical path payoff.
Action: Add an in-app nudge when a user has 4 dependencies: "Add one more dependency to unlock the Critical Path view." Projected to increase depth rate to 22%.
Key Takeaways
- Define "adopted" as a specific action, not a page view or click. Adoption means the user completed the core task the feature enables.
- Track the full funnel: discovery, trial, adoption, repeat, and depth. Each layer reveals a different class of problem.
- Segment the data by at least 3 dimensions. Aggregate adoption rates hide the segments that need different treatment.
- Set targets before launch so you can evaluate objectively. Retroactive targets invite bias.
- Configure alerts for declining or plateauing adoption so you catch problems before the next quarterly review.
About This Template
Created by: Tim Adair
Last Updated: 2026-03-05
Version: 1.0.0
License: Free for personal and commercial use
