Quick Answer (TL;DR)
The Opportunity Solution Tree (OST) is a visual product discovery framework created by Teresa Torres that maps the path from a desired outcome to opportunities (customer needs, pain points, and desires) to solutions (features and ideas) to experiments (tests to validate assumptions). It structures continuous discovery so product teams can make better decisions, avoid building the wrong things, and maintain a clear connection between what they're building and why. The OST is the centerpiece of Torres' Continuous Discovery Habits methodology. For guidance on choosing between discovery frameworks, see our Opportunity Solution Trees vs Assumption Mapping comparison.
What Is an Opportunity Solution Tree?
An Opportunity Solution Tree is a hierarchical visual map that connects your team's desired business outcome to the specific experiments you're running today. It provides a clear, traceable line from strategy to execution.
The tree has four levels:
- Outcome (the root). The measurable business or product outcome you're trying to achieve
- Opportunities (branches). Customer needs, pain points, and desires discovered through research
- Solutions (smaller branches). Ideas for addressing each opportunity
- Experiments (leaves). Tests designed to validate whether each solution will work
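The way the four levels nest can be sketched as a small tree data structure. A minimal sketch in Python; the class and field names here are illustrative assumptions, not part of Torres' framework:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Experiment:
    description: str
    validated: Optional[bool] = None  # None = not yet run

@dataclass
class Solution:
    idea: str
    experiments: list[Experiment] = field(default_factory=list)

@dataclass
class Opportunity:
    need: str  # a pain point, need, or desire from research
    solutions: list[Solution] = field(default_factory=list)
    children: list["Opportunity"] = field(default_factory=list)  # sub-opportunities

@dataclass
class Tree:
    outcome: str  # the single measurable metric at the root
    opportunities: list[Opportunity] = field(default_factory=list)

# Every experiment is reachable from the outcome it ultimately serves.
tree = Tree(
    outcome="Increase trial-to-paid conversion from 8% to 12%",
    opportunities=[
        Opportunity(
            need="Users don't know which features to try first",
            solutions=[
                Solution(
                    idea="Interactive onboarding checklist",
                    experiments=[Experiment("Fake door test on the trial home screen")],
                )
            ],
        )
    ],
)
```

The nesting makes the traceability property concrete: walking from any `Experiment` back up through its `Solution` and `Opportunity` always ends at the root `outcome`.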
The OST was developed by Teresa Torres, a product discovery coach and author of Continuous Discovery Habits. It addresses one of the most persistent problems in product management: the gap between understanding what customers need and deciding what to build. Too often, product teams jump from "customers told us they want X" directly to "let's build X" without exploring the problem space, generating multiple solutions, or testing assumptions.
Why Trees, Not Lists?
Traditional backlogs are flat lists of features. They obscure the reasoning behind each item and make it impossible to see alternative solutions. The tree structure solves this by:
- Making the reasoning visible: you can trace any experiment back through its solution, opportunity, and outcome
- Showing alternatives: each opportunity has multiple possible solutions, reminding the team they have choices
- Preventing pet features: if a solution doesn't connect to an opportunity that connects to an outcome, it doesn't belong on the tree
- Enabling pivoting: if an experiment invalidates a solution, you can move to another solution for the same opportunity without starting over
The Four Levels in Detail
Level 1: Outcome
The outcome is the single, measurable metric your team is trying to move. It's set by leadership (or negotiated between the team and leadership) and defines the team's mission for a given period.
What makes a good outcome:
- It's measurable: "Increase trial-to-paid conversion rate from 8% to 12%"
- It's within the team's influence: The team can directly impact it through product changes
- It's time-bound: Tied to a quarter or other planning period
- It's a lagging indicator of customer value: When customers get more value, the metric improves
Examples of good outcomes:
| Outcome | Why It Works |
|---|---|
| Increase 30-day retention from 65% to 75% | Measurable, directly tied to product value, within team's control |
| Reduce time-to-first-value from 5 days to 2 days | Specific, measurable, directly impacts activation |
| Increase weekly active usage from 3 to 5 sessions | Behavioral metric, tied to engagement and habit formation |
| Reduce support tickets related to billing by 40% | Measurable, specific problem area, clear success criteria |
Examples of poor outcomes:
- "Improve the user experience". Not measurable
- "Increase revenue". Too broad; not within a single team's control
- "Launch feature X". This is an output, not an outcome
- "Make users happier". Not specific or measurable
Level 2: Opportunities
Opportunities are customer needs, pain points, and desires that, if addressed, would move the outcome. They come from research: primarily customer interviews, but also support tickets, analytics, surveys, and observation.
Opportunities are NOT solutions. This distinction is critical:
| Solution (wrong) | Opportunity (right) |
|---|---|
| "Build a dashboard" | "Users can't quickly see whether their project is on track" |
| "Add Slack integration" | "Users miss important updates because they don't check our app frequently" |
| "Create onboarding wizard" | "New users don't understand what to do first after signing up" |
| "Add search filters" | "Users waste time scrolling through irrelevant results" |
How to discover opportunities:
The primary method is weekly customer interviews. Torres recommends that the product trio (PM, designer, engineer) interviews at least one customer per week, every week, as a sustainable habit. These aren't big, formal research projects. They're lightweight, 30-minute conversations that continuously feed the opportunity space.
Interview structure for opportunity discovery:
- Start with a story prompt (5 minutes): "Tell me about the last time you tried to [activity related to your outcome]."
- Dig into the story (15 minutes): Follow the narrative. "What happened next? What were you thinking at that point? What did you try?"
- Explore pain points and needs (10 minutes): "What was the hardest part? What would have made it easier? What did you wish you had?"
Organizing opportunities on the tree:
Opportunities should be organized hierarchically. Broad opportunity areas break down into more specific sub-opportunities:
Outcome: Increase trial-to-paid conversion from 8% to 12%
├── Users don't understand the value proposition during trial
│   ├── Users don't know which features to try first
│   ├── Users don't see how the product fits their workflow
│   └── Users feel overwhelmed by too many options
├── Users hit technical barriers during setup
│   ├── Data import process is confusing
│   ├── Integration setup requires developer help
│   └── Users don't have the right data to get started
└── Users don't experience enough value before the trial ends
    ├── Trial period is too short for complex use cases
    ├── Users don't complete enough actions to see results
    └── Users compare to free alternatives and don't see the premium value
Level 3: Solutions
Solutions are specific ideas for addressing each opportunity. Each opportunity should have multiple solutions. This is where the tree structure shines. By generating 3-5 solutions per opportunity, you avoid fixating on the first idea and increase your chances of finding the best approach.
Generating solutions:
For each opportunity, the product trio brainstorms multiple possible solutions:
| Opportunity | Solution A | Solution B | Solution C |
|---|---|---|---|
| Users don't know which features to try first | Interactive onboarding checklist | Personalized "getting started" path based on role | "Quick wins" tutorial showing 3 high-value actions |
| Data import process is confusing | Wizard-style guided import | CSV template with pre-populated sample data | One-click import from common tools (Trello, Asana) |
| Users don't experience enough value before trial ends | Extend trial to 30 days | Pre-populate account with sample data to show value | Trigger a "value milestone" email when users complete key actions |
Key principles:
- Generate before evaluating. List 3-5 solutions before discussing which is best.
- Consider small and large. Not every solution needs to be a major feature. Sometimes the best solution is a tooltip, an email, or a copy change.
- Include non-product solutions. Sales playbooks, documentation, customer success interventions, and marketing content are all valid solutions.
Level 4: Experiments
Experiments are tests designed to validate or invalidate the assumptions underlying each solution. This is where the OST prevents teams from building full features only to discover they don't work.
Before designing experiments, identify the assumptions embedded in each solution:
For the solution "Interactive onboarding checklist":
- Desirability assumption: Users will engage with a checklist (they won't ignore it)
- Usability assumption: Users can understand and complete each checklist step
- Feasibility assumption: We can build the checklist within our technical constraints
- Viability assumption: The checklist will improve trial-to-paid conversion enough to justify the investment
Types of experiments:
| Experiment Type | What It Tests | Time to Run | Example |
|---|---|---|---|
| Customer interview | Desirability, understanding | 1-2 days | Show a concept mockup and gauge reaction |
| Prototype test | Usability, desirability | 3-5 days | Build a clickable prototype and observe users |
| Fake door test | Desirability at scale | 1-2 weeks | Add a button for the feature, measure clicks, show "coming soon" |
| Concierge test | Value proposition | 1-2 weeks | Manually deliver the solution to 5-10 users |
| A/B test | Impact on outcome | 2-4 weeks | Build a minimal version and measure conversion impact |
| Wizard of Oz | Full experience feasibility | 1-3 weeks | Simulate the feature with human effort behind the scenes |
Experiment design template:
For each experiment, document:
- Assumption being tested: What specific belief are we validating?
- Method: How will we test it?
- Success criteria: What result would confirm the assumption? What would disprove it?
- Sample size: How many users/data points do we need?
- Timeline: How long will the experiment run?
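The template above can be captured as a lightweight record so that success criteria are pinned down before the experiment runs. A minimal sketch; the field names and the 15% threshold are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ExperimentPlan:
    assumption: str           # the specific belief being validated
    method: str               # e.g. "fake door test", "prototype test"
    success_threshold: float  # an observed result at or above this confirms the assumption
    sample_size: int          # users/data points needed
    timeline_days: int        # how long the experiment runs

    def evaluate(self, observed: float) -> bool:
        """True if the observed result meets the pre-registered success criterion."""
        return observed >= self.success_threshold

plan = ExperimentPlan(
    assumption="Users will engage with an onboarding checklist",
    method="fake door test",
    success_threshold=0.15,  # e.g. a 15% click-through rate confirms interest
    sample_size=500,
    timeline_days=14,
)
confirmed = plan.evaluate(0.22)  # a 22% click-through clears the 15% threshold
```

Writing the threshold down up front matters: deciding what would disprove the assumption after seeing the data invites motivated reasoning.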
Building an OST: Step-by-Step
Step 1: Set Your Outcome (1 day)
Work with leadership to define a clear, measurable outcome for your team. If leadership assigns an output ("build feature X"), negotiate it into an outcome ("increase the metric that feature X is supposed to improve").
Step 2: Seed the Opportunity Space (2-4 weeks)
Conduct 6-10 customer interviews focused on stories related to your outcome. Synthesize findings into an initial set of opportunities. Organize them hierarchically on the tree.
Step 3: Prioritize Opportunities (1 day)
You can't pursue every opportunity simultaneously. Evaluate based on:
- Size: How many customers experience this pain point?
- Market factors: Does addressing this create competitive advantage?
- Company factors: Does this align with company strategy?
- Customer factors: How severe is the pain?
Select 1-3 opportunities to focus on for the next few weeks.
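Torres frames this as a structured comparison rather than a formal score, but a rough weighted tally across the four factors can help break ties. A sketch; the weights and 1-5 scores below are illustrative assumptions, not prescribed values:

```python
# Score each factor 1-5; weights reflect what matters most to your team.
WEIGHTS = {"size": 0.3, "market": 0.2, "company": 0.2, "customer": 0.3}

def opportunity_score(scores: dict[str, float]) -> float:
    """Weighted sum of the four prioritization factors."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

candidates = {
    "Data import process is confusing": {"size": 4, "market": 3, "company": 4, "customer": 5},
    "Trial period is too short":        {"size": 2, "market": 2, "company": 3, "customer": 3},
}

# Rank opportunities from highest to lowest weighted score.
ranked = sorted(candidates, key=lambda opp: opportunity_score(candidates[opp]), reverse=True)
```

Treat the numbers as a conversation starter for the trio, not a verdict; the point is to make the trade-offs explicit, not to outsource the decision to arithmetic.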
Step 4: Generate Solutions (Half day)
For each prioritized opportunity, brainstorm 3-5 solutions with your product trio. Don't evaluate yet. Just generate.
Step 5: Map Assumptions (Half day)
For each solution, identify the key assumptions that must be true for it to work. Categorize them as desirability, usability, feasibility, or viability.
Step 6: Design and Run Experiments (Ongoing)
Start with the riskiest assumptions first. Design lightweight experiments. Run them. Learn. Update the tree based on results.
Step 7: Iterate Continuously
The OST is a living document. Every week:
- Conduct at least one customer interview to discover new opportunities
- Review experiment results and update the tree
- Prune solutions that have been invalidated
- Add new solutions based on what you've learned
Real-World OST Example: Spotify's Podcast Discovery
Here's how Spotify might have structured an OST for improving podcast discovery:
Outcome: Increase podcast listening hours per user from 2 to 4 hours/week
├── Users don't know what podcasts to listen to
│   ├── Solution: Personalized podcast recommendations based on music taste
│   │   ├── Experiment: Interview 10 users about discovery behavior
│   │   └── Experiment: A/B test recommendation algorithm on 5% of users
│   ├── Solution: "Daily Podcast Mix" similar to Daily Mix for music
│   │   └── Experiment: Fake door test on home screen
│   └── Solution: Social sharing to see what friends are listening to
│       └── Experiment: Survey 200 users on interest in social podcast features
├── Users start podcasts but don't finish
│   ├── Solution: Playback speed controls (1.5x, 2x)
│   │   └── Experiment: Usage data analysis of current speed control adoption
│   ├── Solution: Episode summaries with chapter markers
│   │   └── Experiment: Prototype test with 8 users
│   └── Solution: "Continue listening" prominent placement
│       └── Experiment: A/B test placement on home screen
└── Users forget to come back for new episodes
    ├── Solution: Smart notifications when new episodes drop
    │   └── Experiment: Opt-in notification test with 1,000 users
    ├── Solution: Auto-download new episodes from followed podcasts
    │   └── Experiment: Concierge test: manually send "new episode" reminders
    └── Solution: Weekly podcast digest email
        └── Experiment: Email campaign to 5,000 users, measure re-engagement
Common Mistakes and Pitfalls
1. Starting with Solutions Instead of Opportunities
The most pervasive mistake. Teams put "Build feature X" on the tree without first understanding what customer need it addresses. If you can't articulate the opportunity, the solution doesn't belong on the tree yet.
2. Only One Solution Per Opportunity
If you have exactly one solution for each opportunity, you're not exploring the solution space. This leads to anchoring on the first idea and missing better alternatives. Force yourself to generate at least three solutions before committing.
3. Confusing Outputs with Outcomes
"Launch the new onboarding flow" is an output. "Increase activation rate from 30% to 45%" is an outcome. If your tree starts with an output, the entire structure is compromised because you've already decided on the solution before exploring alternatives.
4. Building the Tree Once and Never Updating It
The OST is a living artifact. It should change weekly as you learn from interviews and experiments. A stale tree is just a pretty diagram. Build the habit of updating the tree every week during your trio sync.
5. Skipping Experiments
Some teams use the OST to organize their thinking but then skip the experiment layer and jump straight to building. This defeats the purpose. The experiment layer is where you de-risk solutions before committing engineering resources.
6. Making the Tree Too Big
A tree with 50 opportunities, 200 solutions, and 400 experiments is unusable. Focus on the 3-5 most important opportunities and the 2-3 most promising solutions for each. Archive the rest. You can always bring them back.
7. Working in Isolation
The OST is designed for the product trio (PM + designer + engineer) to build and maintain together. A tree built by the PM alone misses technical feasibility insights from engineering and usability insights from design.
The Continuous Discovery Habits Behind OST
Teresa Torres' framework isn't just about the tree. It's about the habits that keep the tree alive and useful:
Habit 1: Weekly Customer Interviews
Interview at least one customer per week. Not a big research project. A lightweight, 30-minute conversation. Use an automated recruiting system so interviews happen consistently without manual scheduling overhead each time.
Habit 2: Interview Snapshots
After each interview, create a one-page summary capturing:
- Key quotes (2-3 direct quotes)
- Opportunities identified (pain points, needs, desires)
- Surprises (anything unexpected)
- Quick sketch of the customer's workflow or mental model
Habit 3: Opportunity Mapping
Weekly, review your interview snapshots and update the opportunity space on your tree. Are new opportunities emerging? Are existing opportunities being validated by multiple interviews?
Habit 4: Assumption Testing
Continuously identify and test the riskiest assumptions in your tree. Run small experiments weekly. Not monthly or quarterly. This keeps the learning cycle fast and prevents big bets on unvalidated assumptions.
Habit 5: Trio Decision-Making
The product trio (PM, designer, engineer) makes discovery decisions together. Not the PM deciding and informing the others. Shared understanding leads to better solutions and stronger buy-in.
OST vs. Other Discovery Frameworks
| Factor | OST | Design Thinking | JTBD | Lean Startup | Story Mapping |
|---|---|---|---|---|---|
| Focus | Connecting outcomes to experiments | Creative problem-solving | Customer motivations | Business validation | User journey and delivery planning |
| Cadence | Continuous (weekly) | Project-based | Research-phase | Build-measure-learn cycles | Planning-phase |
| Key artifact | Visual tree | Prototypes | Job maps | MVPs | Story map |
| Team involved | Product trio | Cross-functional | Research team | Founding/product team | Cross-functional |
| Best for | Ongoing discovery alongside delivery | Tackling ambiguous problems | Understanding "why" behind behavior | Validating business models | Planning releases |
| Unique strength | Maintains connection between strategy and execution | Deep user empathy | Reveals hidden motivations | Speed to market learning | Visualization of user experience |
OST + JTBD
JTBD interviews are an excellent way to seed the opportunity space in your OST. JTBD reveals the jobs customers are trying to do and the underserved outcomes within those jobs. These become opportunities on your tree. The JTBD Builder can help structure the job statements you discover.
OST + RICE
Once you've identified opportunities and solutions on your tree, use RICE scoring to prioritize which solutions to pursue first (the RICE Calculator speeds this up). The OST ensures you're scoring the right things; RICE ensures you're sequencing them effectively. The Product Discovery Handbook covers continuous discovery habits in depth.
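The RICE formula itself is simple: score = (Reach × Impact × Confidence) / Effort. A minimal sketch with the standard scales; the two example solutions and their numbers are illustrative assumptions:

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE = (Reach x Impact x Confidence) / Effort.

    reach: users affected per period; impact: typically a 0.25-3 scale;
    confidence: 0-1; effort: person-months.
    """
    return (reach * impact * confidence) / effort

# Comparing two solutions that survived the experiment layer of the tree:
checklist = rice_score(reach=2000, impact=2, confidence=0.8, effort=2)  # 1600.0
wizard    = rice_score(reach=800,  impact=3, confidence=0.5, effort=4)  # 300.0
```

Note how confidence rewards the discovery work: a solution whose assumptions were validated by experiments earns a higher confidence multiplier than an untested idea.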
Best Practices for OST Implementation
Start Small
Don't try to build a complete tree in one session. Start with your outcome and 3-5 opportunities from recent customer conversations. Add to it weekly as you learn more.
Make It Visible
Put your OST on a wall, in Miro, in FigJam, or in a dedicated tool. Make it the first thing the team sees during planning sessions. Visibility keeps the tree alive and relevant.
Use the Tree in Stakeholder Conversations
When a stakeholder asks "Why aren't we building feature X?" walk them through the tree. Show how the outcome connects to opportunities, opportunities to solutions, and how you're testing solutions through experiments. This builds trust in your decision-making process.
Review the Tree Weekly
During your weekly product trio sync:
- Review experiment results from the past week
- Update the tree based on learnings (prune invalidated solutions, add new ones)
- Review new interview insights and update opportunities
- Plan next week's experiments
Track Your Discovery Velocity
Measure how many interviews you're conducting per week, how many experiments you're running, and how many assumptions you're validating. These leading indicators tell you whether your discovery process is healthy.
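These tallies are easy to pull from a simple activity log. A sketch, assuming the team records one entry per discovery activity tagged with its ISO week (the log structure is an illustrative assumption):

```python
from collections import Counter

# One entry per discovery activity, tagged with its ISO week.
activity_log = [
    ("2025-W01", "interview"),
    ("2025-W01", "experiment"),
    ("2025-W02", "interview"),
    ("2025-W02", "interview"),
    ("2025-W02", "assumption_validated"),
]

def weekly_velocity(log, week):
    """Count discovery activities of each kind in the given week."""
    return Counter(kind for w, kind in log if w == week)

velocity = weekly_velocity(activity_log, "2025-W02")
# velocity["interview"] == 2, velocity["assumption_validated"] == 1
```

A week with zero interviews or zero experiments is the signal to watch for; it means discovery has stalled even if delivery looks busy.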
Connect Discovery to Delivery
The OST bridges discovery and delivery. When an experiment validates a solution, it moves into your delivery backlog with full context: the outcome it serves, the opportunity it addresses, the evidence supporting it, and the assumptions that have been validated. Engineers who understand the "why" build better solutions.
Getting Started with Opportunity Solution Trees
- Form your product trio. PM, designer, and one engineer committed to discovery
- Set a clear outcome tied to a metric your team can influence
- Conduct 4-6 customer interviews to seed your opportunity space
- Build your initial tree with outcome, 3-5 opportunities, and 2-3 solutions per opportunity
- Identify the riskiest assumption in your top-priority solution
- Design and run an experiment to test that assumption this week
- Review and update the tree weekly based on what you learn
- Establish a weekly interview cadence to continuously feed the tree with fresh insights
The Opportunity Solution Tree transforms product discovery from an occasional, unstructured activity into a continuous, disciplined practice. It ensures that everything your team builds is connected to a real customer need and a measurable outcome. That traceability, from strategy through discovery through delivery, is what separates teams that ship features from teams that ship outcomes.
Explore More
- What Is an Opportunity Solution Tree? - Expert answer on opportunity solution trees for product discovery.
- Top 8 Product Discovery Frameworks (2026) - 8 proven discovery frameworks that help PMs find the right problems to solve.
- Discovery for Mid-Level Product Managers - Build a continuous discovery practice.
- Top 8 Agile Frameworks for Product Teams (2026) - 8 agile frameworks compared side by side for product teams.