What This Template Is For
Most product teams treat discovery as something that happens between sprints, whenever someone finds time. The result: vague customer signals, untested assumptions, and features that ship based on opinion rather than evidence. A discovery sprint fixes this by compressing the entire discover-validate-decide cycle into five focused days.
This template gives you a day-by-day plan for running a structured discovery sprint. By Friday, your team will have interviewed real users, mapped assumptions to evidence, tested at least one solution concept, and made a clear go/no-go decision. It works for new product bets, risky features, or any problem where the team lacks confidence about what to build.
The sprint is designed for a product trio (PM, designer, engineer) but scales to 5-6 people. If you are building a continuous discovery practice around weekly touchpoints, the Product Discovery Weekly Sync Template is a better fit. This template is for when you need to go deep on a single problem in a compressed timeframe.
When to Use This Template
- Before committing engineering resources to a new feature. You have a hypothesis about what to build, but the team has not talked to users about the specific problem. A discovery sprint validates (or kills) the idea before a single story is written.
- When stakeholders disagree about the problem. A VP says users want X, support says users complain about Y, and data tells a third story. Five days of structured discovery replaces opinion with evidence.
- At the start of a new product initiative. You got the green light for a 0-to-1 product or a major new capability. Use a discovery sprint to narrow the problem space before scoping the solution.
- When a launched feature is underperforming. Adoption is below target and the team does not know why. A discovery sprint surfaces the gap between what you built and what users actually need.
- During quarterly planning. Run a discovery sprint on the highest-uncertainty bet to de-risk it before committing a full quarter of engineering time.
How to Use This Template
Step 1: Define the scope
Pick one problem or opportunity to investigate. Write a single question the sprint must answer by Friday. Broad questions ("What should we build next?") produce broad answers. Narrow questions ("Why do enterprise users abandon onboarding at step 3?") produce actionable answers.
Step 2: Recruit participants before the sprint starts
You need 5-8 interviews across the week. Start recruiting at least one week before the sprint. Use your existing user base, support tickets, or a tool like UserTesting or Respondent. For B2B, ask your CS team to connect you with accounts that match your target profile.
Step 3: Block the full week for the product trio
The PM, designer, and one engineer need their calendars cleared Monday through Friday. Half-attention discovery produces half-useful results. Cancel recurring meetings. Set Slack to DND. Treat this like a delivery sprint: the team is heads-down.
Step 4: Run the sprint day by day
Follow the daily structure below. Each day has a morning block (2-3 hours) and an afternoon block (2-3 hours). Adjust timing to fit your team's schedule, but do not skip or reorder days. The sequence matters because each day builds on the previous day's outputs.
Step 5: Make the decision on Friday
The sprint ends with a go/no-go decision. Not a "let's think about it" decision. A real commitment: build it, kill it, or run another sprint to investigate a specific open question.
The Template
Pre-Sprint Checklist (Complete Before Monday)
- ☐ Sprint question defined: Write a single question the sprint must answer
- ☐ Sponsor identified: Name the stakeholder who will act on the sprint's output
- ☐ Trio committed: PM, designer, and engineer have the full week blocked
- ☐ Interviews scheduled: 5-8 problem interviews confirmed across Monday through Wednesday, plus 3-5 users booked for Friday's validation sessions
- ☐ Interview guide drafted: 10-15 open-ended questions aligned to the sprint question
- ☐ Existing data gathered: Pull relevant analytics, support tickets, NPS comments, and prior research
- ☐ Assumptions listed: Write down the team's current assumptions about the problem and solution (you will test these)
- ☐ Room booked: A dedicated space (physical or virtual) with a whiteboard for the full week
- ☐ Supplies ready: Sticky notes, markers, timer, recording tool (Grain, Dovetail, or Zoom recording)
Day 1 (Monday): Frame the Problem
Morning (2.5 hours)
- ☐ Kickoff (30 min): Review the sprint question, ground rules, and schedule for the week
- ☐ Stakeholder download (45 min): Hear from the sprint sponsor and any subject-matter experts on what they know and what they need to learn
- ☐ Data walk (45 min): Review existing analytics, support tickets, and prior research as a team. Identify patterns and contradictions
- ☐ Assumption map (30 min): List every assumption the team is making about the user, the problem, and the solution. Plot each on a 2x2 matrix: importance (x-axis) vs. evidence level (y-axis). High-importance, low-evidence assumptions are your sprint targets
Afternoon (2.5 hours)
- ☐ Conduct 2 customer interviews (90 min): Focus on understanding the problem. Do not pitch solutions. Record every session
- ☐ Debrief interviews (30 min): Each team member writes their top 3 observations on sticky notes. Cluster and discuss
- ☐ Update assumption map (30 min): Mark which assumptions were supported or challenged by today's interviews
Day 1 output: Prioritized assumption map, clustered interview themes, updated sprint question (if needed)
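If your team tracks the assumption map in a spreadsheet instead of on a whiteboard, the Day 1 prioritization can be expressed as a simple filter-and-sort. This is a minimal sketch, not part of the template itself: the 1-5 rating scale, the example assumptions, and the cutoff thresholds are all hypothetical choices you would tune to your own map.

```python
# Hypothetical assumption-map scoring. "importance" and "evidence" are 1-5
# ratings the team assigns during the Day 1 assumption-mapping exercise.
assumptions = [
    {"text": "Users find the invite step confusing", "importance": 5, "evidence": 1},
    {"text": "Trial users evaluate solo before inviting a team", "importance": 4, "evidence": 2},
    {"text": "Admins control who can send invites", "importance": 2, "evidence": 4},
]

def sprint_targets(assumptions, min_importance=4, max_evidence=2):
    """High-importance, low-evidence assumptions are the sprint targets."""
    targets = [a for a in assumptions
               if a["importance"] >= min_importance and a["evidence"] <= max_evidence]
    # Most important first; among equals, least-evidenced first
    return sorted(targets, key=lambda a: (-a["importance"], a["evidence"]))

for a in sprint_targets(assumptions):
    print(f'{a["text"]} (importance {a["importance"]}, evidence {a["evidence"]})')
```

The same logic works in a spreadsheet: filter importance >= 4 and evidence <= 2, then sort descending by importance. The point is to make the top-left quadrant of the 2x2 explicit before interviews start.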
Day 2 (Tuesday): Deepen Understanding
Morning (2.5 hours)
- ☐ Conduct 2 customer interviews (90 min): Dig into the patterns from Monday. Ask follow-up questions on emerging themes
- ☐ Debrief and cluster (60 min): Add new observations to the wall. Look for repeated pain points, workarounds, and unmet needs
Afternoon (2.5 hours)
- ☐ Opportunity mapping (60 min): Use an Opportunity Solution Tree to map the problem space. Place customer needs and pain points as opportunities branching from your desired outcome
- ☐ "How Might We" generation (45 min): Write 15-20 HMW questions based on interview insights. Each HMW reframes a pain point as a design challenge
- ☐ Vote on top opportunities (30 min): Each team member gets 3 votes. Identify the 2-3 highest-signal opportunities to explore on Wednesday
- ☐ Conduct 1 more interview if possible (optional)
Day 2 output: Opportunity Solution Tree, ranked HMW questions, 2-3 target opportunities for solution exploration
Day 3 (Wednesday): Generate and Select Solutions
Morning (2.5 hours)
- ☐ Conduct 1-2 final interviews (60-90 min): Targeted questions on the top opportunities. Validate that the pain points are real and frequent
- ☐ Solution sketching (60 min): Each team member independently sketches 2-3 solution concepts for the top opportunity. No discussion during sketching. Use the Crazy 8s method: fold paper into 8 panels, sketch 8 variations in 8 minutes, then pick your best 2-3
Afternoon (2.5 hours)
- ☐ Solution review (60 min): Each person presents their sketches. No critique during presentations. After all sketches are shared, the team discusses strengths, risks, and feasibility
- ☐ Dot voting (15 min): Each team member gets 5 votes. Place dots on the solutions (or parts of solutions) with the highest potential
- ☐ Storyboard the winning concept (75 min): Combine the best elements into a single storyboard: 6-8 frames showing the user journey from trigger to outcome. This becomes the blueprint for Thursday's prototype
Day 3 output: Winning solution storyboard, documented decision rationale, list of riskiest assumptions in the proposed solution
Day 4 (Thursday): Build and Prepare
Morning (3 hours)
- ☐ Build a testable prototype (2.5 hours): This is not a production-ready product. Build the minimum artifact needed to get a reaction from users. Options:
- Figma clickthrough (interactive screens, no backend)
- Wizard of Oz prototype (fake the backend, real frontend)
- Slide deck walkthrough (narrated screenshots with decision points)
- Landing page with a value proposition and fake CTA
- Spreadsheet or form that simulates the workflow
- ☐ Write the test script (30 min): 5-7 tasks for users to attempt with the prototype. Include observation prompts: "What do you expect to happen when you click this?" and "Is anything missing?"
Afternoon (2 hours)
- ☐ Pilot test with 1 internal user (45 min): Run the prototype test script with someone outside the sprint team. Fix confusing flows or broken interactions before real users see it
- ☐ Refine the prototype (60 min): Fix issues surfaced in the pilot. Do not add new features. Polish only what blocks the test
- ☐ Confirm Friday's interview schedule (15 min): Reconfirm 3-5 users for tomorrow's validation sessions
Day 4 output: Testable prototype, finalized test script, pilot test feedback incorporated
Day 5 (Friday): Validate and Decide
Morning (3 hours)
- ☐ Run 3-5 prototype validation sessions (2.5 hours): One team member facilitates, one takes notes, one observes. Rotate roles between sessions. Record everything. After each session, have each observer independently note:
- Did the user understand the concept? (Yes / Partially / No)
- Did the user find value? (Yes / Partially / No)
- What confused them?
- What excited them?
- Would they use this? (Definite yes / Maybe / No)
Afternoon (2 hours)
- ☐ Debrief all sessions (45 min): Create a summary grid. For each user: concept clarity, perceived value, willingness to use, top concern. Look for patterns across all 3-5 sessions
- ☐ Score against assumptions (30 min): Return to Monday's assumption map. For each critical assumption, write: Validated / Invalidated / Inconclusive. Be honest
- ☐ Make the decision (30 min): Based on evidence collected across the week, choose one:
- Go: Strong signal. Move to delivery. Write the first user stories
- Pivot: The problem is real but the solution needs rethinking. Scope a focused follow-up sprint
- Kill: Insufficient evidence of a real problem or viable solution. Redirect resources
- Investigate further: One or two critical questions remain unanswered. Run a targeted mini-sprint (2-3 days) on the specific gap
- ☐ Write the sprint summary (15 min): Fill in the Sprint Summary template below
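The Friday debrief grid and assumption scoring can be tallied mechanically once the observer notes are in. Below is a hedged sketch of one way to roll the per-session answers up into summary percentages; the scoring weights and the data (which mirror the filled example later in this template) are illustrative, not a prescribed method.

```python
# Hypothetical tally of Friday's observer notes. The 2/1/0 weights for the
# "Would they use this?" answers are an illustrative choice, not part of
# the template. Session data mirrors the filled example's summary grid.
WOULD_USE = {"Definite yes": 2, "Maybe": 1, "No": 0}

sessions = [
    {"user": "User 1", "clear": "Yes", "value": "Yes", "would_use": "Definite yes"},
    {"user": "User 2", "clear": "Yes", "value": "Yes", "would_use": "Definite yes"},
    {"user": "User 3", "clear": "Yes", "value": "Yes", "would_use": "Maybe"},
    {"user": "User 4", "clear": "Yes", "value": "Yes", "would_use": "Definite yes"},
    {"user": "User 5", "clear": "Partially", "value": "Yes", "would_use": "Maybe"},
]

def summarize(sessions):
    """Roll per-session observer notes into fractions between 0 and 1."""
    n = len(sessions)
    clarity = sum(s["clear"] == "Yes" for s in sessions) / n
    value = sum(s["value"] == "Yes" for s in sessions) / n
    would_use = sum(WOULD_USE[s["would_use"]] for s in sessions) / (2 * n)
    return {"clarity": clarity, "value": value, "would_use": would_use}

print(summarize(sessions))  # clarity 0.8, value 1.0, would_use 0.8
```

Numbers like these inform the go/pivot/kill call; they do not make it. A 0.8 "would use" score with a consistent top concern across sessions still needs the team's judgment on whether that concern is fixable in delivery.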
Sprint Summary
| Field | Details |
|---|---|
| Sprint question | [The question you set out to answer] |
| Date | [Monday date] - [Friday date] |
| Team | [Names and roles] |
| Users interviewed | [Count and profile, e.g. "6 enterprise PMs at Series B-D SaaS companies"] |
| Decision | [Go / Pivot / Kill / Investigate further] |
Key findings:
- [Finding 1 with supporting evidence]
- [Finding 2 with supporting evidence]
- [Finding 3 with supporting evidence]
Assumptions validated:
- [Assumption]: [Evidence]
Assumptions invalidated:
- [Assumption]: [Evidence]
Next steps:
- ☐ [Action item 1 with owner and deadline]
- ☐ [Action item 2 with owner and deadline]
- ☐ [Action item 3 with owner and deadline]
Filled Example: SaaS Onboarding Drop-Off Discovery Sprint
Sprint Context
A B2B SaaS company noticed that 40% of trial users abandoned onboarding at the "invite teammates" step. The product team assumed users found the step confusing, but they had no direct evidence. They ran a discovery sprint to understand why.
Sprint question: Why do trial users skip or abandon the "invite teammates" step, and what would make them complete it?
Day 1-2 Highlights
The team interviewed 6 trial users who had dropped off at this step. Three patterns emerged:
- Solo evaluators did not have teammates to invite yet. They were testing the product alone before recommending it to their team.
- Two users worried about "spamming" colleagues with an invite before they could explain what the product did.
- One user skipped it because the step felt mandatory and they wanted to explore the product first.
The assumption that "users find the step confusing" was invalidated. The real issue was timing: users were not ready to invite teammates during initial setup.
Day 3 Solution
The team sketched three approaches: (A) make the step optional with a "Skip for now" button, (B) move the invite step to after the first "aha moment" (creating a project), and (C) give solo evaluators a different onboarding path. They selected a combination of A and B.
Day 4-5 Prototype and Validation
They built a Figma prototype showing onboarding with the invite step moved to after creating a first project, plus a persistent "Invite your team" prompt in the sidebar. Five users tested it:
| User | Concept Clear? | Found Value? | Would Use? | Top Concern |
|---|---|---|---|---|
| User 1 (Solo evaluator) | Yes | Yes | Definite yes | None |
| User 2 (Team lead) | Yes | Yes | Definite yes | "Will they get the same view I set up?" |
| User 3 (Solo evaluator) | Yes | Yes | Maybe | Wanted to see the invite email before sending |
| User 4 (Team lead) | Yes | Yes | Definite yes | None |
| User 5 (Solo evaluator) | Partially | Yes | Maybe | Did not notice the sidebar prompt |
Decision: Go. Move the invite step to post-first-project. Add a preview of the invite email. Make the sidebar prompt more visible. Estimated 2-week delivery sprint.
Tips for Success
- Recruit more users than you need. Schedule 8 interviews knowing 1-2 will cancel. Running Friday's validation with only 2 users weakens your evidence.
- Separate problem interviews from solution interviews. Monday through Wednesday: understand the problem. Thursday/Friday: test solutions. Mixing these produces muddled insights.
- Time-box ruthlessly. Each activity has a time limit. When time is up, move on. Perfectionism in discovery is a trap. You are optimizing for signal, not certainty.
- Record every interview. Memory is unreliable. Recordings let you revisit exact quotes when writing user stories. Tools like Dovetail or Grain make tagging and retrieval fast.
- Include an engineer. Engineers hear user problems differently than PMs and designers. They spot feasibility issues early and propose solutions that are cheaper to build. If your engineer cannot attend the full week, prioritize Wednesday (solution sketching) and Friday (validation).
Common Mistakes
- Asking leading questions in interviews. "Don't you think this feature would be useful?" is not a discovery question. Ask "Tell me about the last time you faced this problem" and "What did you do about it?" The customer interview guide covers question framing in detail.
- Falling in love with your first solution. Day 3 exists to generate multiple options before converging. Teams that skip straight to building their initial idea miss better solutions that emerge from sketching.
- Treating the prototype as a product. The Day 4 prototype is a disposable learning tool. Spending 8 hours making it pixel-perfect is 6 hours wasted. If a user can understand the concept and give feedback, the prototype is good enough.
- Skipping the decision. The whole point of the sprint is to make a call on Friday. "We need to think about it more" is not a valid outcome. If the evidence is mixed, say so and define exactly what additional evidence you need and how you will get it.
About This Template
Created by: Tim Adair
Last Updated: 4/3/2026
Version: 1.0.0
License: Free for personal and commercial use