
Product Discovery Sprint Template

Free 5-day discovery sprint template for product managers. Run a structured week of customer interviews, assumption testing, and solution validation.

Last updated 2026-04-03

What This Template Is For

Most product teams treat discovery as something that happens between sprints, whenever someone finds time. The result: vague customer signals, untested assumptions, and features that ship based on opinion rather than evidence. A discovery sprint fixes this by compressing the entire discover-validate-decide cycle into five focused days.

This template gives you a day-by-day plan for running a structured discovery sprint. By Friday, your team will have interviewed real users, mapped assumptions to evidence, tested at least one solution concept, and made a clear go/no-go decision. It works for new product bets, risky features, or any problem where the team lacks confidence about what to build.

The sprint is designed for a product trio (PM, designer, engineer) but scales to 5-6 people. If you are building a continuous discovery practice around weekly touchpoints, the Product Discovery Weekly Sync Template is a better fit. This template is for when you need to go deep on a single problem in a compressed timeframe.

When to Use This Template

  • Before committing engineering resources to a new feature. You have a hypothesis about what to build, but the team has not talked to users about the specific problem. A discovery sprint validates (or kills) the idea before a single story is written.
  • When stakeholders disagree about the problem. A VP says users want X, support says users complain about Y, and data tells a third story. Five days of structured discovery replaces opinion with evidence.
  • At the start of a new product initiative. You got the green light for a 0-to-1 product or a major new capability. Use a discovery sprint to narrow the problem space before scoping the solution.
  • When a launched feature is underperforming. Adoption is below target and the team does not know why. A discovery sprint surfaces the gap between what you built and what users actually need.
  • During quarterly planning. Run a discovery sprint on the highest-uncertainty bet to de-risk it before committing a full quarter of engineering time.

How to Use This Template

Step 1: Define the scope

Pick one problem or opportunity to investigate. Write a single question the sprint must answer by Friday. Broad questions ("What should we build next?") produce broad answers. Narrow questions ("Why do enterprise users abandon onboarding at step 3?") produce actionable answers.

Step 2: Recruit participants before the sprint starts

You need 5-8 interviews across the week. Start recruiting at least one week before the sprint. Use your existing user base, support tickets, or a tool like UserTesting or Respondent. For B2B, ask your CS team to connect you with accounts that match your target profile.

Step 3: Block the full week for the product trio

The PM, designer, and one engineer need their calendars cleared Monday through Friday. Half-attention discovery produces half-useful results. Cancel recurring meetings. Set Slack to DND. Treat this like a delivery sprint: the team is heads-down.

Step 4: Run the sprint day by day

Follow the daily structure below. Each day has a morning block (2-3 hours) and an afternoon block (2-3 hours). Adjust timing to fit your team's schedule, but do not skip or reorder days. The sequence matters because each day builds on the previous day's outputs.

Step 5: Make the decision on Friday

The sprint ends with a go/no-go decision. Not a "let's think about it" decision. A real commitment: build it, kill it, or run another sprint to investigate a specific open question.


The Template

Pre-Sprint Checklist (Complete Before Monday)

  • Sprint question defined: Write a single question the sprint must answer
  • Sponsor identified: Name the stakeholder who will act on the sprint's output
  • Trio committed: PM, designer, and engineer have the full week blocked
  • Interviews scheduled: 5-8 users confirmed across Monday through Wednesday
  • Interview guide drafted: 10-15 open-ended questions aligned to the sprint question
  • Existing data gathered: Pull relevant analytics, support tickets, NPS comments, and prior research
  • Assumptions listed: Write down the team's current assumptions about the problem and solution (you will test these)
  • Room booked: A dedicated space (physical or virtual) with a whiteboard for the full week
  • Supplies ready: Sticky notes, markers, timer, recording tool (Grain, Dovetail, or Zoom recording)

Day 1 (Monday): Frame the Problem

Morning (2.5 hours)

  • Kickoff (30 min): Review the sprint question, ground rules, and schedule for the week
  • Stakeholder download (45 min): Hear from the sprint sponsor and any subject-matter experts on what they know and what they need to learn
  • Data walk (45 min): Review existing analytics, support tickets, and prior research as a team. Identify patterns and contradictions
  • Assumption map (30 min): List every assumption the team is making about the user, the problem, and the solution. Plot each on a 2x2 matrix: importance (x-axis) vs. evidence level (y-axis). High-importance, low-evidence assumptions are your sprint targets
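If the team keeps the assumption map in a shared doc rather than on stickies, the prioritization rule ("high importance, low evidence first") can be sketched in a few lines. This is a minimal illustration with hypothetical assumptions and made-up 1-5 scores, not part of the template itself:

```python
# Minimal sketch of the importance-vs-evidence prioritization.
# Scores are hypothetical 1-5 ratings agreed on by the team.
assumptions = [
    {"text": "Users find the invite step confusing", "importance": 5, "evidence": 1},
    {"text": "Enterprise admins control onboarding", "importance": 3, "evidence": 4},
    {"text": "Users want to invite teammates at all", "importance": 5, "evidence": 2},
]

# Sprint targets: high importance (>= 4), low evidence (<= 2),
# sorted so the most important, least-evidenced assumption comes first.
targets = sorted(
    (a for a in assumptions if a["importance"] >= 4 and a["evidence"] <= 2),
    key=lambda a: (-a["importance"], a["evidence"]),
)
for a in targets:
    print(f'TEST THIS: {a["text"]} (importance {a["importance"]}, evidence {a["evidence"]})')
```

A whiteboard works just as well; the point is only that the same two-axis sort decides what the sprint must test.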

Afternoon (2.5 hours)

  • Conduct 2 customer interviews (90 min): Focus on understanding the problem. Do not pitch solutions. Record every session
  • Debrief interviews (30 min): Each team member writes their top 3 observations on sticky notes. Cluster and discuss
  • Update assumption map (30 min): Mark which assumptions were supported or challenged by today's interviews

Day 1 output: Prioritized assumption map, clustered interview themes, updated sprint question (if needed)


Day 2 (Tuesday): Deepen Understanding

Morning (2.5 hours)

  • Conduct 2 customer interviews (90 min): Dig into the patterns from Monday. Ask follow-up questions on emerging themes
  • Debrief and cluster (60 min): Add new observations to the wall. Look for repeated pain points, workarounds, and unmet needs

Afternoon (2.5 hours)

  • Opportunity mapping (60 min): Use an Opportunity Solution Tree to map the problem space. Place customer needs and pain points as opportunities branching from your desired outcome
  • "How Might We" generation (45 min): Write 15-20 HMW questions based on interview insights. Each HMW reframes a pain point as a design challenge
  • Vote on top opportunities (30 min): Each team member gets 3 votes. Identify the 2-3 highest-signal opportunities to explore on Wednesday
  • Conduct 1 more interview if possible (optional)

Day 2 output: Opportunity Solution Tree, ranked HMW questions, 2-3 target opportunities for solution exploration


Day 3 (Wednesday): Generate and Select Solutions

Morning (2.5 hours)

  • Conduct 1-2 final interviews (60-90 min): Targeted questions on the top opportunities. Validate that the pain points are real and frequent
  • Solution sketching (60 min): Each team member independently sketches 2-3 solution concepts for the top opportunity. No discussion during sketching. Use the Crazy 8s method: fold paper into 8 panels, sketch 8 variations in 8 minutes, then pick your best 2-3

Afternoon (2.5 hours)

  • Solution review (60 min): Each person presents their sketches. No critique during presentations. After all sketches are shared, the team discusses strengths, risks, and feasibility
  • Dot voting (15 min): Each team member gets 5 votes. Place dots on the solutions (or parts of solutions) with the highest potential
  • Storyboard the winning concept (75 min): Combine the best elements into a single storyboard: 6-8 frames showing the user journey from trigger to outcome. This becomes the blueprint for Thursday's prototype

Day 3 output: Winning solution storyboard, documented decision rationale, list of riskiest assumptions in the proposed solution


Day 4 (Thursday): Build and Prepare

Morning (3 hours)

  • Build a testable prototype: This is not a production-ready product. Build the minimum artifact needed to get a reaction from users. Options:

- Figma clickthrough (interactive screens, no backend)

- Wizard of Oz prototype (fake the backend, real frontend)

- Slide deck walkthrough (narrated screenshots with decision points)

- Landing page with a value proposition and fake CTA

- Spreadsheet or form that simulates the workflow

  • Write the test script (30 min): 5-7 tasks for users to attempt with the prototype. Include observation prompts: "What do you expect to happen when you click this?" and "Is anything missing?"

Afternoon (2 hours)

  • Pilot test with 1 internal user (45 min): Run the prototype test script with someone outside the sprint team. Fix confusing flows or broken interactions before real users see it
  • Refine the prototype (60 min): Fix issues surfaced in the pilot. Do not add new features. Polish only what blocks the test
  • Confirm Friday interview schedule: Reconfirm 3-5 users for validation interviews tomorrow

Day 4 output: Testable prototype, finalized test script, pilot test feedback incorporated


Day 5 (Friday): Validate and Decide

Morning (3 hours)

  • Run 3-5 prototype validation sessions (2.5 hours): One team member facilitates, one takes notes, one observes. Rotate roles between sessions. Record everything. After each session, have each observer independently note:

- Did the user understand the concept? (Yes / Partially / No)

- Did the user find value? (Yes / Partially / No)

- What confused them?

- What excited them?

- Would they use this? (Definite yes / Maybe / No)

Afternoon (2 hours)

  • Debrief all sessions (45 min): Create a summary grid. For each user: concept clarity, perceived value, willingness to use, top concern. Look for patterns across all 3-5 sessions
  • Score against assumptions (30 min): Return to Monday's assumption map. For each critical assumption, write: Validated / Invalidated / Inconclusive. Be honest
  • Make the decision (30 min): Based on evidence collected across the week, choose one:

- Go: Strong signal. Move to delivery. Write the first user stories

- Pivot: The problem is real but the solution needs rethinking. Scope a focused follow-up sprint

- Kill: Insufficient evidence of a real problem or viable solution. Redirect resources

- Investigate further: One or two critical questions remain unanswered. Run a targeted mini-sprint (2-3 days) on the specific gap

  • Write the sprint summary (15 min): Fill in the Sprint Summary template below
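If session notes are captured digitally (a spreadsheet export, for example), the Friday summary grid can be tallied with a short script. A sketch with invented session data, assuming the Yes / Partially / No scale above:

```python
from collections import Counter

# Hypothetical per-session observer notes on the Yes / Partially / No scale.
sessions = [
    {"user": "User 1", "understood": "Yes", "value": "Yes", "would_use": "Definite yes"},
    {"user": "User 2", "understood": "Yes", "value": "Partially", "would_use": "Maybe"},
    {"user": "User 3", "understood": "Partially", "value": "Yes", "would_use": "Maybe"},
    {"user": "User 4", "understood": "Yes", "value": "Yes", "would_use": "Definite yes"},
]

# Summary grid: count responses per question across all sessions.
for field in ("understood", "value", "would_use"):
    print(field, dict(Counter(s[field] for s in sessions)))
```

The counts do not make the decision for you; they just make the pattern across 3-5 sessions visible at a glance before the team votes.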

Sprint Summary

Field | Details
Sprint question | [The question you set out to answer]
Date | [Monday date] - [Friday date]
Team | [Names and roles]
Users interviewed | [Count and profile, e.g. "6 enterprise PMs at Series B-D SaaS companies"]
Decision | [Go / Pivot / Kill / Investigate further]

Key findings:

  1. [Finding 1 with supporting evidence]
  2. [Finding 2 with supporting evidence]
  3. [Finding 3 with supporting evidence]

Assumptions validated:

  • [Assumption]: [Evidence]

Assumptions invalidated:

  • [Assumption]: [Evidence]

Next steps:

  • [Action item 1 with owner and deadline]
  • [Action item 2 with owner and deadline]
  • [Action item 3 with owner and deadline]

Filled Example: SaaS Onboarding Drop-Off Discovery Sprint

Sprint Context

A B2B SaaS company noticed that 40% of trial users abandoned onboarding at the "invite teammates" step. The product team assumed users found the step confusing, but they had no direct evidence. They ran a discovery sprint to understand why.

Sprint question: Why do trial users skip or abandon the "invite teammates" step, and what would make them complete it?

Day 1-2 Highlights

The team interviewed 6 trial users who had dropped off at this step. Three patterns emerged:

  1. Solo evaluators did not have teammates to invite yet. They were testing the product alone before recommending it to their team.
  2. Two users worried about "spamming" colleagues with an invite before they could explain what the product did.
  3. One user skipped it because the step felt mandatory and they wanted to explore the product first.

The assumption that "users find the step confusing" was invalidated. The real issue was timing: users were not ready to invite teammates during initial setup.

Day 3 Solution

The team sketched three approaches: (A) make the step optional with a "Skip for now" button, (B) move the invite step to after the first "aha moment" (creating a project), and (C) give solo evaluators a different onboarding path. They selected a combination of A and B.

Day 4-5 Prototype and Validation

They built a Figma prototype showing onboarding with the invite step moved to after creating a first project, plus a persistent "Invite your team" prompt in the sidebar. Five users tested it:

User | Concept Clear? | Found Value? | Would Use? | Top Concern
User 1 (Solo evaluator) | Yes | Yes | Definite yes | None
User 2 (Team lead) | Yes | Yes | Definite yes | "Will they get the same view I set up?"
User 3 (Solo evaluator) | Yes | Yes | Maybe | Wanted to see the invite email before sending
User 4 (Team lead) | Yes | Yes | Definite yes | None
User 5 (Solo evaluator) | Partially | Yes | Maybe | Did not notice the sidebar prompt

Decision: Go. Move the invite step to post-first-project. Add a preview of the invite email. Make the sidebar prompt more visible. Estimated 2-week delivery sprint.


Tips for Success

  • Recruit more users than you need. Schedule 8 interviews knowing 1-2 will cancel. Running Friday's validation with only 2 users weakens your evidence.
  • Separate problem interviews from solution interviews. Monday through Wednesday: understand the problem. Thursday/Friday: test solutions. Mixing these produces muddled insights.
  • Time-box ruthlessly. Each activity has a time limit. When time is up, move on. Perfectionism in discovery is a trap. You are optimizing for signal, not certainty.
  • Record every interview. Memory is unreliable. Recordings let you revisit exact quotes when writing user stories. Tools like Dovetail or Grain make tagging and retrieval fast.
  • Include an engineer. Engineers hear user problems differently than PMs and designers. They spot feasibility issues early and propose solutions that are cheaper to build. If your engineer cannot attend the full week, prioritize Wednesday (solution sketching) and Friday (validation).

Common Mistakes

  • Asking leading questions in interviews. "Don't you think this feature would be useful?" is not a discovery question. Ask "Tell me about the last time you faced this problem" and "What did you do about it?" The customer interview guide covers question framing in detail.
  • Falling in love with your first solution. Day 3 exists to generate multiple options before converging. Teams that skip straight to building their initial idea miss better solutions that emerge from sketching.
  • Treating the prototype as a product. The Day 4 prototype is a disposable learning tool. Spending 8 hours making it pixel-perfect is 6 hours wasted. If a user can understand the concept and give feedback, the prototype is good enough.
  • Skipping the decision. The whole point of the sprint is to make a call on Friday. "We need to think about it more" is not a valid outcome. If the evidence is mixed, say so and define exactly what additional evidence you need and how you will get it.

About This Template

Created by: Tim Adair

Last Updated: 2026-04-03

Version: 1.0.0

License: Free for personal and commercial use

Frequently Asked Questions

How is this different from a Google Ventures design sprint?
A GV design sprint focuses on prototyping and testing a single solution concept in five days. A discovery sprint spends more time understanding the problem before generating solutions. Days 1-2 are entirely focused on customer interviews and assumption mapping. This is better suited for situations where the team is not yet sure they are solving the right problem.
Can I run a discovery sprint with a remote team?
Yes. Use Miro or FigJam for collaborative mapping, Zoom for interviews, and a shared Notion doc for the daily outputs. The critical requirement is that the team works synchronously during the morning and afternoon blocks. Async discovery sprints do not produce the same momentum or quality of insight.
What if I can only find 3 users for the full week?
Three is the minimum. Interview each person twice: once during problem exploration (Day 1-2) and once during validation (Day 5). You lose the diversity of perspectives that 5-8 users provide, but you still get usable signal. If you cannot find 3 users, postpone the sprint. Running discovery without users is just brainstorming.
How do I convince my manager to block a full week for discovery?
Frame it in terms of risk and cost. "We are about to commit 3 engineers for 6 weeks to build [feature]. That is $X in salaries. A 1-week discovery sprint costs $Y and tells us whether we are building the right thing before we spend $X." The [RICE framework](/frameworks/rice-framework) can help you quantify the impact and confidence of the proposed feature to strengthen your case.
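With hypothetical numbers plugged into that framing (your loaded costs will differ), the arithmetic looks like:

```python
# Hypothetical loaded cost figures; substitute your own.
weekly_cost_per_engineer = 4_000                 # loaded salary per person-week
build_cost = 3 * 6 * weekly_cost_per_engineer    # 3 engineers x 6 weeks
sprint_cost = 3 * 1 * weekly_cost_per_engineer   # trio x 1 week (approx.)

print(f"Build commitment: ${build_cost:,}")
print(f"Discovery sprint: ${sprint_cost:,}")
print(f"Sprint is {sprint_cost / build_cost:.0%} of the build cost")
```

At these assumed rates, one week of discovery costs about a sixth of the build commitment it de-risks, which is the comparison that usually wins the argument.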
What do I do with the sprint outputs after Friday?
The sprint summary goes to the sponsor immediately. If the decision is "Go," the storyboard and interview recordings feed directly into user story writing. If it is "Kill" or "Pivot," write a one-page memo explaining what you learned and archive it in your team's research repository. Learnings from killed ideas prevent the next PM from re-investigating the same dead end. See the [decision log template](/templates/decision-log-template) for capturing the rationale.
