Free template · Time required: 1-2 hours (setup); 30 minutes per analysis cycle

Cohort Analysis Template for Product Analytics


Last updated 2026-03-04

What This Template Is For

Cohort analysis groups users by a shared characteristic (usually signup date) and tracks their behavior over time. It answers questions that aggregate metrics hide: "Is our October cohort retaining better than our August cohort?" and "Did the onboarding change actually improve long-term retention, or just short-term activation?"

Without cohort analysis, improving metrics is a guessing game. Your overall retention rate might be 40%, but that number blends users who signed up three years ago (and are deeply embedded) with users who signed up last week (and may never return). Cohort analysis separates these groups so you can see whether your product is genuinely getting better at retaining new users.

This template covers two types of cohort analysis: time-based (grouped by signup week or month) and behavioral (grouped by actions taken). It includes retention curve setup, cohort retention curve formatting, segmentation criteria, and an analysis framework. For the strategic context, the Product Analytics Handbook covers retention measurement in depth. The glossary entry on cohort analysis provides a quick primer on the concept. To calculate retention rates from your cohort data, the day-7 retention and day-30 retention metric guides explain standard calculation methods.


How to Use This Template

  1. Define your cohort grouping. For most SaaS products, start with weekly signup cohorts. Monthly cohorts work for lower-volume products.
  2. Define the retention event. This is the action that counts as "retained." It should be a core action, not a passive one. Logging in is too passive. Completing a task, sending a message, or creating a document are better.
  3. Set your time windows. Standard windows are D1, D7, D14, D30, D60, D90. Choose windows that match your product's natural usage cadence.
  4. Pull the data and fill in the retention grid. Each cell shows the % of the cohort that performed the retention event in that time window.
  5. Look for patterns: is retention improving cohort-over-cohort? Is there a consistent drop-off point? Do certain behavioral segments retain better?
  6. Run the analysis at a consistent cadence (weekly or monthly) and compare trends over time.
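Steps 1-4 above can be sketched in code. The following is a minimal, stdlib-only illustration (the function and data shapes are hypothetical, not from any specific analytics tool) that groups users into weekly signup cohorts and fills a day-N retention grid, where a user counts as retained in window Dn if they performed the retention event exactly n days after signup — one common convention; adjust if you use rolling or unbounded windows.

```python
from datetime import date, timedelta
from collections import defaultdict

def weekly_cohort(signup: date) -> date:
    """Monday of the ISO week the user signed up in."""
    return signup - timedelta(days=signup.weekday())

def retention_grid(signups, events, windows=(1, 7, 14, 30)):
    """
    signups: {user_id: signup_date}
    events:  {user_id: set of dates the user performed the retention event}
    Returns {cohort_start: {"size": n, window: pct, ...}} using day-N
    retention: did the user perform the event N days after signup.
    """
    cohorts = defaultdict(list)
    for user, signup in signups.items():
        cohorts[weekly_cohort(signup)].append(user)

    grid = {}
    for start, users in sorted(cohorts.items()):
        row = {"size": len(users)}
        for d in windows:
            retained = sum(
                1 for u in users
                if signups[u] + timedelta(days=d) in events.get(u, set())
            )
            row[d] = round(100 * retained / len(users), 1)
        grid[start] = row
    return grid
```

In practice you would pull `signups` and `events` from your analytics warehouse rather than build them by hand; the grid output maps directly onto the template tables below.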

The Template

Cohort Definition

| Field | Details |
|---|---|
| Analysis Name | [e.g., "Q1 2026 Weekly Signup Cohort Retention"] |
| Cohort Type | [Time-based / Behavioral] |
| Grouping | [Weekly signup / Monthly signup / Feature adoption / Plan type] |
| Retention Event | [The specific action that counts as "retained," e.g., "completed at least 1 task"] |
| Time Windows | [e.g., D1, D7, D14, D30, D60, D90] |
| Data Source | [Analytics tool and dataset] |
| Exclusions | [e.g., "Exclude internal test accounts, exclude users who signed up and cancelled within 24 hours"] |

Time-Based Retention Grid

Each row is a cohort (grouped by signup period). Each column is a time window. Cells show the % of users from that cohort who performed the retention event in that window.

| Cohort | Size | D1 | D7 | D14 | D30 | D60 | D90 |
|---|---|---|---|---|---|---|---|
| [Week 1] | [N] | [%] | [%] | [%] | [%] | [%] | [%] |
| [Week 2] | [N] | [%] | [%] | [%] | [%] | [%] | |
| [Week 3] | [N] | [%] | [%] | [%] | [%] | | |
| [Week 4] | [N] | [%] | [%] | [%] | | | |
| [Week 5] | [N] | [%] | [%] | | | | |
| [Week 6] | [N] | [%] | | | | | |

Reading the grid:

  • Read down columns to see if retention is improving over time (newer cohorts vs. older)
  • Read across rows to see the retention curve shape for each cohort
  • Empty cells mean insufficient time has passed for that cohort to reach that window

Retention Curve Shape Analysis

Plot the average retention curve and identify the shape:

| Shape | What It Means | Action |
|---|---|---|
| Steep early drop, then flat | Users who survive the first week tend to stay. Onboarding is the problem. | Improve activation and onboarding |
| Gradual decline, no flattening | Product is not forming a habit; users drift away over time. | Increase engagement loops and re-engagement triggers |
| Flat from day 1 | Very high retention: either a sticky product or a biased sample. | Validate that the retention event is meaningful, not trivially easy |
| Staircase drops | Retention drops at billing cycles (D30, D60). | Improve perceived value before renewal and reduce involuntary churn |
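A rough first-pass classification of the first three shapes can be automated. This is an illustrative heuristic only — the thresholds (10 pp early drop, 1 pp late drop) are assumptions to tune against your own baselines, and staircase drops need billing-cycle-aware logic not shown here.

```python
def curve_shape(curve):
    """
    curve: retention percentages at increasing windows,
    e.g. [D1, D7, D14, D30, D60]. Thresholds are illustrative.
    """
    early_drop = curve[0] - curve[1]   # decline across the first window
    late_drop = curve[-2] - curve[-1]  # decline across the last window
    if late_drop <= 1:                 # curve has flattened at the tail
        if early_drop >= 10:
            return "steep early drop, then flat"
        return "flat from day 1"
    return "gradual decline, no flattening"
```

Always eyeball the plotted curve as well; a heuristic like this is a triage aid, not a verdict.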

Behavioral Cohort Segmentation

Beyond time-based cohorts, segment users by actions they took (or did not take) during a defined window (typically the first 7 or 14 days).

| Segment | Definition | Cohort Size | D30 Retention | D60 Retention |
|---|---|---|---|---|
| [e.g., "Power onboarders"] | [Completed all onboarding steps in first 24 hours] | [N] | [%] | [%] |
| [e.g., "Inviters"] | [Invited at least 1 teammate in first 7 days] | [N] | [%] | [%] |
| [e.g., "Solo users"] | [Never invited a teammate in first 30 days] | [N] | [%] | [%] |
| [e.g., "Feature explorers"] | [Used 3+ distinct features in first 7 days] | [N] | [%] | [%] |
| [e.g., "Single-feature users"] | [Used only 1 feature in first 7 days] | [N] | [%] | [%] |

Key question: Which first-week behaviors predict 90-day retention? The behavioral segments with the highest D90 retention contain the actions you should be nudging all new users toward.
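Computing each segment's retention and its gap versus the cohort average is straightforward once users are expressed as ID sets. A minimal sketch (the function name and data shapes are hypothetical):

```python
def segment_retention(segments, retained, population):
    """
    segments:   {name: set of user_ids in that behavioral segment}
    retained:   set of user_ids that performed the retention event
                in the chosen window (e.g., D30)
    population: set of all user_ids in the cohort
    Returns {name: (segment_retention_pct, pp_vs_cohort_average)}.
    """
    avg = 100 * len(retained & population) / len(population)
    out = {}
    for name, users in segments.items():
        pct = 100 * len(retained & users) / len(users)
        out[name] = (round(pct, 1), round(pct - avg, 1))
    return out
```

Note that a positive gap shows correlation, not causation: users who invite teammates may simply be more committed to begin with, so validate any "nudge" hypothesis with an experiment.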


Analysis Framework

For each analysis cycle, answer these five questions:

  • Is retention improving? Compare the latest cohort's D7/D14/D30 to the same window in cohorts from 4, 8, and 12 weeks ago.
  • Where is the biggest drop-off? Identify the time window with the steepest decline. This is your highest-impact improvement opportunity.
  • Which behavioral segment retains best? Identify the first-week behaviors that correlate with high long-term retention.
  • Did recent changes move the needle? Compare cohorts before and after a specific product change (new feature, onboarding redesign, pricing change).
  • Are there segment-specific issues? Break retention by plan type, acquisition channel, company size, or geography to find hidden problems.

Tracking Changes Over Time

| Date | Change | Affected Cohorts | Expected Impact | Actual Impact |
|---|---|---|---|---|
| [Date] | [e.g., "Launched guided onboarding"] | [Week 10+] | [+5 pp D7 retention] | [Pending / +X pp] |
| [Date] | [e.g., "Introduced weekly digest email"] | [Week 12+] | [+3 pp D30 retention] | [Pending / +X pp] |
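Filling the "Actual Impact" column amounts to comparing average retention for cohorts before and after the ship date. A minimal sketch, assuming cohorts that started on or after the ship date were fully exposed to the change (the function name and input shape are illustrative):

```python
from datetime import date

def change_impact(cohort_retention, shipped):
    """
    cohort_retention: {cohort_start_date: retention % at some fixed
                       window, e.g. D7}
    shipped:          date the change reached 100% of users
    Returns (before_avg, after_avg, lift_pp). Cohorts starting before
    the ship date form the baseline; later cohorts the treatment group.
    """
    before = [pct for start, pct in cohort_retention.items() if start < shipped]
    after = [pct for start, pct in cohort_retention.items() if start >= shipped]
    b = sum(before) / len(before)
    a = sum(after) / len(after)
    return round(b, 1), round(a, 1), round(a - b, 1)
```

As the FAQ below notes, wait for at least 3-4 post-change cohorts before trusting the lift, to wash out novelty effects and seasonality.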

Filled Example: SaaS Weekly Cohort Analysis

Cohort Definition

| Field | Details |
|---|---|
| Analysis Name | TaskFlow Q1 2026 Weekly Signup Cohort Retention |
| Cohort Type | Time-based (weekly signup) |
| Grouping | ISO week of account creation |
| Retention Event | Completed at least 1 task (created + marked done) in the given time window |
| Time Windows | D1, D7, D14, D30, D60 |
| Data Source | Amplitude cohort analysis + BigQuery verification |
| Exclusions | Internal test accounts (@taskflow.io emails), accounts deleted within 24h |

Retention Grid

| Cohort | Size | D1 | D7 | D14 | D30 | D60 |
|---|---|---|---|---|---|---|
| Jan 6 (Wk 1) | 482 | 38% | 26% | 21% | 16% | 13% |
| Jan 13 (Wk 2) | 511 | 40% | 27% | 22% | 17% | 13% |
| Jan 20 (Wk 3) | 498 | 39% | 28% | 23% | 18% | 14% |
| Jan 27 (Wk 4) | 523 | 41% | 29% | 24% | 18% | |
| Feb 3 (Wk 5) | 547 | 42% | 30% | 25% | 19% | |
| Feb 10 (Wk 6) | 539 | 43% | 31% | 25% | | |
| Feb 17 (Wk 7) | 562 | 48% | 36% | | | |
| Feb 24 (Wk 8) | 571 | 49% | 37% | | | |
| Mar 3 (Wk 9) | 589 | 51% | | | | |

Key finding: D1 retention jumped from 43% to 48% starting in Week 7 (Feb 17). This aligns with the guided onboarding wizard launch on Feb 14. The lift is holding through Week 9.

Behavioral Segment Analysis

| Segment | Definition | Size (Wk 7-9) | D7 Retention | vs. Average |
|---|---|---|---|---|
| Power onboarders | All 4 onboarding steps completed in <1 hour | 412 (24%) | 52% | +16 pp |
| Inviters | Invited 1+ teammate in first 7 days | 318 (18%) | 48% | +12 pp |
| Solo completers | Completed tasks but never invited anyone | 621 (36%) | 31% | -5 pp |
| Abandoned onboarding | Dropped off during onboarding | 371 (22%) | 8% | -28 pp |

Insight: Power onboarders (users who complete all onboarding steps quickly) retain at 52% D7, while users who abandon onboarding retain at 8%. The 44 pp gap suggests the highest-impact investment is reducing onboarding abandonment, not adding features for already-retained users.

Key Takeaways

  • Group users by signup period and track a meaningful retention event at standard time windows
  • Read retention grids down columns (improving over time?) and across rows (where is the drop-off?)
  • Behavioral segmentation reveals which first-week actions predict long-term retention
  • Track product changes against cohort boundaries to measure impact
  • Run cohort analysis at a consistent cadence and compare trends over 6-8 cohorts minimum

About This Template

Created by: Tim Adair

Last Updated: 2026-03-04

Version: 1.0.0

License: Free for personal and commercial use

Frequently Asked Questions

What is the right retention event to track?
The retention event should measure whether the user got value from the product, not just whether they showed up. For a project management tool, "completed a task" is better than "logged in." For a messaging app, "sent a message" is better than "opened the app." The test: if a user performed only this action and nothing else, would they have gotten value? If yes, it is a good retention event. See the [glossary entry on cohort analysis](/glossary/cohort-analysis) for more on choosing the right event.
Should I use weekly or monthly cohorts?
Weekly cohorts for products with daily or weekly usage patterns (SaaS tools, productivity apps, social products). Monthly cohorts for products with lower-frequency usage (marketplaces, financial tools, seasonal products). The rule of thumb: your cohort window should be shorter than your natural usage cycle. If users typically return weekly, monthly cohorts will miss important trends.
How many cohorts do I need before drawing conclusions?
At minimum, 6-8 cohorts of similar size. With fewer, random variation dominates and trends are unreliable. For behavioral segments, each segment needs at least 100 users per cohort to be statistically meaningful. If segment sizes are too small, widen your cohort window (e.g., monthly instead of weekly).
How do I compare "before and after" a product change?
Pick a clean cutoff date (the date the change shipped to 100% of users). Cohorts before that date are the "before" group; cohorts after are the "after" group. Compare the same retention windows (D7, D30) across groups. Allow at least 3-4 post-change cohorts before concluding the change worked, to account for novelty effects and seasonality.
What is the difference between cohort retention and overall retention rate?
Overall retention blends all users regardless of when they signed up. It is slow-moving and hides trends. Cohort retention isolates each signup group, so you can see whether your product is getting better at retaining new users over time. A product can have declining overall retention (because of a growing base of churned users) while having improving cohort retention (because each new cohort retains better than the last). Cohort retention is the metric that tells you whether your improvements are working.
