What This Template Is For
Cohort analysis groups users by a shared characteristic (usually signup date) and tracks their behavior over time. It answers questions that aggregate metrics hide: "Is our October cohort retaining better than our August cohort?" and "Did the onboarding change actually improve long-term retention, or just short-term activation?"
Without cohort analysis, improving metrics is a guessing game. Your overall retention rate might be 40%, but that number blends users who signed up three years ago (and are deeply embedded) with users who signed up last week (and may never return). Cohort analysis separates these groups so you can see whether your product is genuinely getting better at retaining new users.
This template covers two types of cohort analysis: time-based (grouped by signup week or month) and behavioral (grouped by actions taken). It includes retention grid setup, retention curve shape analysis, behavioral segmentation criteria, and an analysis framework. For the strategic context, the Product Analytics Handbook covers retention measurement in depth. The glossary entry on cohort analysis provides a quick primer on the concept. To calculate retention rates from your cohort data, the day-7 and day-30 retention metric guides explain standard calculation methods.
How to Use This Template
- Define your cohort grouping. For most SaaS products, start with weekly signup cohorts. Monthly cohorts work for lower-volume products.
- Define the retention event. This is the action that counts as "retained." It should be a core action, not a passive one. Logging in is too passive. Completing a task, sending a message, or creating a document are better.
- Set your time windows. Standard windows are D1, D7, D14, D30, D60, D90. Choose windows that match your product's natural usage cadence.
- Pull the data and fill in the retention grid. Each cell shows the % of the cohort that performed the retention event in that time window.
- Look for patterns: is retention improving cohort-over-cohort? Is there a consistent drop-off point? Do certain behavioral segments retain better?
- Run the analysis at a consistent cadence (weekly or monthly) and compare trends over time.
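The steps above can be sketched in code. This is a minimal, tool-agnostic sketch in plain Python (no analytics library assumed) that groups users into weekly signup cohorts and fills a retention grid. It uses "range" retention (retained at DN means the user performed the event within N days of signup), which is one common definition among several; day-N-exact retention is another. Field names and windows are illustrative.

```python
from collections import defaultdict
from datetime import date, timedelta

WINDOWS = [1, 7, 14, 30]  # D1, D7, D14, D30

def week_start(d: date) -> date:
    """ISO week start (Monday), used as the cohort key."""
    return d - timedelta(days=d.weekday())

def retention_grid(users, events):
    """
    users:  {user_id: signup_date}
    events: iterable of (user_id, event_date) for the chosen retention event.
    Returns {cohort_week: {"size": n, "D1": pct, "D7": pct, ...}} using
    range retention: retained at DN if the event occurred within N days
    of signup (signup day itself excluded).
    """
    # Earliest qualifying event per user; the earliest event alone
    # determines every range-retention window.
    first_event = {}
    for uid, ed in events:
        sd = users.get(uid)
        if sd is None or ed <= sd:
            continue
        if uid not in first_event or ed < first_event[uid]:
            first_event[uid] = ed

    grid = defaultdict(lambda: {"size": 0, **{f"D{w}": 0 for w in WINDOWS}})
    for uid, sd in users.items():
        row = grid[week_start(sd)]
        row["size"] += 1
        ed = first_event.get(uid)
        if ed is None:
            continue  # never performed the retention event
        days = (ed - sd).days
        for w in WINDOWS:
            if days <= w:
                row[f"D{w}"] += 1

    # Convert counts to percentages of cohort size
    for row in grid.values():
        for w in WINDOWS:
            row[f"D{w}"] = round(100 * row[f"D{w}"] / row["size"], 1)
    return dict(grid)
```

In practice you would pull `users` and `events` from your analytics warehouse; the point of the sketch is the shape of the computation, not the data access.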
The Template
Cohort Definition
| Field | Details |
|---|---|
| Analysis Name | [e.g., "Q1 2026 Weekly Signup Cohort Retention"] |
| Cohort Type | [Time-based / Behavioral] |
| Grouping | [Weekly signup / Monthly signup / Feature adoption / Plan type] |
| Retention Event | [The specific action that counts as "retained," e.g., "completed at least 1 task"] |
| Time Windows | [e.g., D1, D7, D14, D30, D60, D90] |
| Data Source | [Analytics tool and dataset] |
| Exclusions | [e.g., "Exclude internal test accounts, exclude users who signed up and cancelled within 24 hours"] |
Time-Based Retention Grid
Each row is a cohort (grouped by signup period). Each column is a time window. Cells show the % of users from that cohort who performed the retention event in that window.
| Cohort | Size | D1 | D7 | D14 | D30 | D60 | D90 |
|---|---|---|---|---|---|---|---|
| [Week 1] | [N] | [%] | [%] | [%] | [%] | [%] | [%] |
| [Week 2] | [N] | [%] | [%] | [%] | [%] | [%] | |
| [Week 3] | [N] | [%] | [%] | [%] | [%] | | |
| [Week 4] | [N] | [%] | [%] | [%] | | | |
| [Week 5] | [N] | [%] | [%] | | | | |
| [Week 6] | [N] | [%] | | | | | |
Reading the grid:
- Read down columns to see if retention is improving over time (newer cohorts vs. older)
- Read across rows to see the retention curve shape for each cohort
- Empty cells mean insufficient time has passed for that cohort to reach that window
Retention Curve Shape Analysis
Plot the average retention curve and identify the shape:
| Shape | What It Means | Action |
|---|---|---|
| Steep early drop, then flat | Users who survive the first week tend to stay. Onboarding is the problem. | Improve activation and onboarding |
| Gradual decline, no flattening | Product is not forming a habit. Users drift away over time. | Increase engagement loops and re-engagement triggers |
| Flat from day 1 | Very high retention. Either a sticky product or a biased sample. | Validate the retention event is meaningful, not trivially easy |
| Staircase drops | Retention drops at billing cycles (D30, D60). | Improve perceived value before renewal and reduce involuntary churn |
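A rough way to automate the shape check, assuming the curve is a list of percentages at successive windows (e.g. D1, D7, D14, D30). The thresholds `flat_tol` and `early_drop` are illustrative defaults, not industry standards, and this heuristic does not detect the billing-cycle staircase pattern, which needs window positions, not just values.

```python
def curve_shape(curve, flat_tol=3.0, early_drop=15.0):
    """
    Heuristic shape label for a retention curve given as percentages at
    successive windows, e.g. [40, 27, 22, 17].
    flat_tol:   max drop (pp) between windows still considered "flat"
    early_drop: min first-window drop (pp) considered a steep early drop
    """
    if len(curve) < 2:
        raise ValueError("need at least two windows to assess shape")
    drops = [a - b for a, b in zip(curve, curve[1:])]
    if all(d <= flat_tol for d in drops):
        return "flat"  # very high retention, or a trivially easy event
    if drops[0] >= early_drop and all(d <= flat_tol for d in drops[1:]):
        return "steep early drop, then flat"  # onboarding problem
    return "gradual decline"  # habit not forming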
Behavioral Cohort Segmentation
Beyond time-based cohorts, segment users by actions they took (or did not take) during a defined window (typically the first 7 or 14 days).
| Segment | Definition | Cohort Size | D30 Retention | D60 Retention |
|---|---|---|---|---|
| [e.g., "Power onboarders"] | [Completed all onboarding steps in first 24 hours] | [N] | [%] | [%] |
| [e.g., "Inviters"] | [Invited at least 1 teammate in first 7 days] | [N] | [%] | [%] |
| [e.g., "Solo users"] | [Never invited a teammate in first 30 days] | [N] | [%] | [%] |
| [e.g., "Feature explorers"] | [Used 3+ distinct features in first 7 days] | [N] | [%] | [%] |
| [e.g., "Single-feature users"] | [Used only 1 feature in first 7 days] | [N] | [%] | [%] |
Key question: Which first-week behaviors predict long-term retention? The behavioral segments with the highest D60 (or D90, if your windows extend that far) retention contain the actions you should be nudging all new users toward.
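One way to tabulate this comparison, assuming each behavioral segment is simply a set of user ids. `segment_retention` is a hypothetical helper, not part of any analytics tool; it reports each segment's retention alongside its gap versus the cohort average in percentage points.

```python
def segment_retention(cohort, retained, segments):
    """
    cohort:   set of user ids in the cohort
    retained: subset that performed the retention event in the chosen
              window (e.g. D30)
    segments: {segment_name: set of user ids in that behavioral segment}
    Returns {segment_name: (size, retention_pct, pp_vs_cohort_average)}.
    """
    avg = 100 * len(retained & cohort) / len(cohort)
    out = {}
    for name, members in segments.items():
        seg = members & cohort  # ignore segment members outside the cohort
        if not seg:
            continue
        pct = 100 * len(retained & seg) / len(seg)
        out[name] = (len(seg), round(pct, 1), round(pct - avg, 1))
    return out
```

Comparing against the cohort average (rather than between segments) keeps the baseline stable as segment definitions change between analysis cycles.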
Analysis Framework
For each analysis cycle, answer these five questions:
- ☐ Is retention improving? Compare the latest cohort's D7/D14/D30 to the same window in cohorts from 4, 8, and 12 weeks ago.
- ☐ Where is the biggest drop-off? Identify the time window with the steepest decline. This is your highest-impact improvement opportunity.
- ☐ Which behavioral segment retains best? Identify the first-week behaviors that correlate with high long-term retention.
- ☐ Did recent changes move the needle? Compare cohorts before and after a specific product change (new feature, onboarding redesign, pricing change).
- ☐ Are there segment-specific issues? Break retention by plan type, acquisition channel, company size, or geography to find hidden problems.
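For the "did recent changes move the needle" check, a minimal sketch: compare mean retention at the same window across the cohorts before and after the change. This gives a rough directional read in percentage points, not a significance test; with weekly cohorts of a few hundred users, small deltas can easily be noise.

```python
def change_impact(before, after):
    """
    before / after: lists of retention percentages at the same window
    (e.g. D7) for cohorts preceding and following a product change.
    Returns the difference of means in percentage points.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return round(mean(after) - mean(before), 1)
```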
Tracking Changes Over Time
| Date | Change | Affected Cohorts | Expected Impact | Actual Impact |
|---|---|---|---|---|
| [Date] | [e.g., "Launched guided onboarding"] | [Week 10+] | [+5 pp D7 retention] | [Pending / +X pp] |
| [Date] | [e.g., "Introduced weekly digest email"] | [Week 12+] | [+3 pp D30 retention] | [Pending / +X pp] |
Filled Example: SaaS Weekly Cohort Analysis
Cohort Definition
| Field | Details |
|---|---|
| Analysis Name | TaskFlow Q1 2026 Weekly Signup Cohort Retention |
| Cohort Type | Time-based (weekly signup) |
| Grouping | ISO week of account creation |
| Retention Event | Completed at least 1 task (created + marked done) in the given time window |
| Time Windows | D1, D7, D14, D30, D60 |
| Data Source | Amplitude cohort analysis + BigQuery verification |
| Exclusions | Internal test accounts (@taskflow.io emails), accounts deleted within 24h |
Retention Grid
| Cohort | Size | D1 | D7 | D14 | D30 | D60 |
|---|---|---|---|---|---|---|
| Jan 6 (Wk 1) | 482 | 38% | 26% | 21% | 16% | 13% |
| Jan 13 (Wk 2) | 511 | 40% | 27% | 22% | 17% | 13% |
| Jan 20 (Wk 3) | 498 | 39% | 28% | 23% | 18% | 14% |
| Jan 27 (Wk 4) | 523 | 41% | 29% | 24% | 18% | |
| Feb 3 (Wk 5) | 547 | 42% | 30% | 25% | 19% | |
| Feb 10 (Wk 6) | 539 | 43% | 31% | 25% | | |
| Feb 17 (Wk 7) | 562 | 48% | 36% | | | |
| Feb 24 (Wk 8) | 571 | 49% | 37% | | | |
| Mar 3 (Wk 9) | 589 | 51% | | | | |
Key finding: D1 retention jumped from 43% to 48% starting in Week 7 (Feb 17). This aligns with the guided onboarding wizard launch on Feb 14. The lift is holding through Week 9.
Behavioral Segment Analysis
| Segment | Definition | Size (Wk 7-9) | D7 Retention | vs. Average |
|---|---|---|---|---|
| Power onboarders | All 4 onboarding steps completed in <1 hour | 412 (24%) | 52% | +16 pp |
| Inviters | Invited 1+ teammate in first 7 days | 318 (18%) | 48% | +12 pp |
| Solo completers | Completed tasks but never invited anyone | 621 (36%) | 31% | -5 pp |
| Abandoned onboarding | Dropped off during onboarding | 371 (22%) | 8% | -28 pp |
Insight: Users who complete all four onboarding steps retain at 52% D7, and those who invite a teammate retain at 48%. Users who abandon onboarding retain at 8%. The 44 pp gap between the best and worst segments suggests the highest-impact investment is reducing onboarding abandonment, not adding features for already-retained users.
Key Takeaways
- Group users by signup period and track a meaningful retention event at standard time windows
- Read retention grids down columns (improving over time?) and across rows (where is the drop-off?)
- Behavioral segmentation reveals which first-week actions predict long-term retention
- Track product changes against cohort boundaries to measure impact
- Run cohort analysis at a consistent cadence and compare trends over 6-8 cohorts minimum
About This Template
Created by: Tim Adair
Last Updated: 3/4/2026
Version: 1.0.0
License: Free for personal and commercial use
