Quick Answer (TL;DR)
The Opportunity Solution Tree (OST) is a visual framework created by Teresa Torres that maps the path from a desired outcome to opportunities (customer needs, pain points, and desires) to solutions (features and ideas) to experiments (tests to validate assumptions). It structures continuous discovery so product teams can make better decisions, avoid building the wrong things, and maintain a clear connection between what they're building and why. The OST is the centerpiece of Torres' Continuous Discovery Habits methodology.
What Is an Opportunity Solution Tree?
An Opportunity Solution Tree is a hierarchical visual map that connects your team's desired business outcome to the specific experiments you're running today. It provides a clear, traceable line from strategy to execution.
The tree has four levels:

1. Outcome -- the measurable change in customer behavior or business results the team is trying to drive.
2. Opportunities -- the customer needs, pain points, and desires that, if addressed, would move that outcome.
3. Solutions -- the ideas for addressing each opportunity.
4. Experiments -- the tests that validate or invalidate the assumptions behind each solution.
The OST was developed by Teresa Torres, a product discovery coach and author of Continuous Discovery Habits. It addresses one of the most persistent problems in product management: the gap between understanding what customers need and deciding what to build. Too often, product teams jump from "customers told us they want X" directly to "let's build X" without exploring the problem space, generating multiple solutions, or testing assumptions.
Why Trees, Not Lists?
Traditional backlogs are flat lists of features. They obscure the reasoning behind each item and make it impossible to see alternative solutions. The tree structure solves this by:

- Making the reasoning behind every solution visible: each idea traces back to a specific opportunity and to the outcome it serves
- Forcing exploration of multiple solutions per opportunity instead of committing to the first idea
- Showing, at a glance, which parts of the problem space are well covered and which are unexplored
- Giving the team a shared artifact for prioritization and stakeholder conversations
The Four Levels in Detail
Level 1: Outcome
The outcome is the single, measurable metric your team is trying to move. It's set by leadership (or negotiated between the team and leadership) and defines the team's mission for a given period.
What makes a good outcome:

- It measures a change in customer behavior or business results, not the delivery of a feature
- It's specific and quantified, with a baseline and a target
- It's within the team's influence -- the team's work can plausibly move the number
- It's scoped to a defined period, so the team knows when to evaluate progress
Examples of good outcomes:
| Outcome | Why It Works |
|---|---|
| Increase 30-day retention from 65% to 75% | Measurable, directly tied to product value, within the team's control |
| Reduce time-to-first-value from 5 days to 2 days | Specific, measurable, directly impacts activation |
| Increase weekly active usage from 3 to 5 sessions | Behavioral metric, tied to engagement and habit formation |
| Reduce support tickets related to billing by 40% | Measurable, specific problem area, clear success criteria |
Examples of poor outcomes:

- "Launch the redesigned onboarding flow by Q3" -- an output, not an outcome; it says nothing about the change it should produce
- "Increase revenue" -- too broad and too far removed from what a single product team can directly influence
- "Improve the user experience" -- not measurable; there's no way to know when you've succeeded
- "Ship five new integrations this quarter" -- measures activity, not impact
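Outcomes like the first good example above only work if everyone agrees on how the metric is computed. As an illustration, here is a minimal sketch of one common way to compute 30-day retention from signup and activity dates; the data shape and the seven-day activity window are assumptions made for the example, not part of the OST framework.

```python
from datetime import date, timedelta

# Illustrative data: each user's signup date and the dates they were active.
users = {
    "u1": {"signed_up": date(2024, 1, 10), "active_days": [date(2024, 2, 12)]},
    "u2": {"signed_up": date(2024, 1, 15), "active_days": []},
    "u3": {"signed_up": date(2024, 1, 20), "active_days": [date(2024, 2, 25)]},
}

def thirty_day_retention(users: dict) -> float:
    """Share of users active at least once in the week starting 30 days after signup."""
    retained = 0
    for u in users.values():
        window_start = u["signed_up"] + timedelta(days=30)
        window_end = window_start + timedelta(days=7)
        if any(window_start <= d < window_end for d in u["active_days"]):
            retained += 1
    return retained / len(users)

print(f"30-day retention: {thirty_day_retention(users):.0%}")  # 67% for this sample
```

Whatever definition you choose, write it down next to the outcome so the trio and leadership are measuring the same thing.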
Level 2: Opportunities
Opportunities are customer needs, pain points, and desires that, if addressed, would move the outcome. They come from research -- primarily customer interviews, but also support tickets, analytics, surveys, and observation.
Opportunities are NOT solutions. This distinction is critical:
| Solution (wrong) | Opportunity (right) |
|---|---|
| "Build a dashboard" | "Users can't quickly see whether their project is on track" |
| "Add Slack integration" | "Users miss important updates because they don't check our app frequently" |
| "Create onboarding wizard" | "New users don't understand what to do first after signing up" |
| "Add search filters" | "Users waste time scrolling through irrelevant results" |
How to discover opportunities:
The primary method is weekly customer interviews. Torres recommends that the product trio (PM, designer, engineer) interview at least one customer per week, every week, as a sustainable habit. These aren't big, formal research projects -- they're lightweight, 30-minute conversations that continuously feed the opportunity space.
Interview structure for opportunity discovery:

- Open with a story prompt tied to your outcome: "Tell me about the last time you..." rather than "Would you use...?"
- Let the customer walk through that specific experience step by step; ask what they did, not what they would do
- Listen for needs, pain points, and desires as they emerge from the story -- don't ask for feature requests
- Close by capturing the opportunities you heard, in the customer's own words where possible
Organizing opportunities on the tree:
Opportunities should be organized hierarchically. Broad opportunity areas break down into more specific sub-opportunities:
Outcome: Increase trial-to-paid conversion from 8% to 12%
├── Users don't understand the value proposition during trial
│   ├── Users don't know which features to try first
│   ├── Users don't see how the product fits their workflow
│   └── Users feel overwhelmed by too many options
├── Users hit technical barriers during setup
│   ├── Data import process is confusing
│   ├── Integration setup requires developer help
│   └── Users don't have the right data to get started
└── Users don't experience enough value before the trial ends
    ├── Trial period is too short for complex use cases
    ├── Users don't complete enough actions to see results
    └── Users compare to free alternatives and don't see the premium value
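Teams that keep their tree in a whiteboard tool can stop here, but if you want to track the same hierarchy in a script or export it for reporting, it maps naturally onto a small nested data structure. The sketch below is illustrative only; the class and field names are assumptions, not part of Torres' method.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Experiment:
    description: str
    status: str = "planned"  # planned | running | validated | invalidated

@dataclass
class Solution:
    description: str
    experiments: list[Experiment] = field(default_factory=list)

@dataclass
class Opportunity:
    description: str
    children: list[Opportunity] = field(default_factory=list)  # sub-opportunities
    solutions: list[Solution] = field(default_factory=list)

@dataclass
class OpportunitySolutionTree:
    outcome: str
    opportunities: list[Opportunity] = field(default_factory=list)

# A fragment of the tree above, expressed as data.
tree = OpportunitySolutionTree(
    outcome="Increase trial-to-paid conversion from 8% to 12%",
    opportunities=[
        Opportunity(
            description="Users don't understand the value proposition during trial",
            children=[Opportunity("Users don't know which features to try first")],
        ),
    ],
)
```

The shape is the point: every experiment hangs off a solution, every solution off an opportunity, and every opportunity off the outcome.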
Level 3: Solutions
Solutions are specific ideas for addressing each opportunity. Each opportunity should have multiple solutions -- this is where the tree structure shines. By generating 3-5 solutions per opportunity, you avoid fixating on the first idea and increase your chances of finding the best approach.
Generating solutions:
For each opportunity, the product trio brainstorms multiple possible solutions:
| Opportunity | Solution A | Solution B | Solution C |
|---|---|---|---|
| Users don't know which features to try first | Interactive onboarding checklist | Personalized "getting started" path based on role | "Quick wins" tutorial showing 3 high-value actions |
| Data import process is confusing | Wizard-style guided import | CSV template with pre-populated sample data | One-click import from common tools (Trello, Asana) |
| Users don't experience enough value before trial ends | Extend trial to 30 days | Pre-populate account with sample data to show value | Trigger a "value milestone" email when users complete key actions |
Key principles:

- Generate at least three solutions per opportunity before evaluating any of them
- Compare and contrast solutions against each other rather than judging one idea in isolation
- Keep every solution tied to a specific opportunity -- if you can't name the opportunity, the idea doesn't go on the tree
- Treat solutions as hypotheses to be tested, not commitments to be delivered
Level 4: Experiments
Experiments are tests designed to validate or invalidate the assumptions underlying each solution. This is where the OST prevents teams from building full features only to discover they don't work.
Assumption mapping:
Before designing experiments, identify the assumptions embedded in each solution:
For the solution "Interactive onboarding checklist":

- Desirability: new users actually want step-by-step guidance rather than exploring on their own
- Usability: users will notice the checklist and understand what each step asks of them
- Feasibility: we can reliably detect when a user has completed each step
- Viability: completing the checklist leads to the behaviors that drive trial conversion, at a cost the business can support
Types of experiments:
| Experiment Type | What It Tests | Time to Run | Example |
|---|---|---|---|
| Customer interview | Desirability, understanding | 1-2 days | Show a concept mockup and gauge reaction |
| Prototype test | Usability, desirability | 3-5 days | Build a clickable prototype and observe users |
| Fake door test | Desirability at scale | 1-2 weeks | Add a button for the feature, measure clicks, show "coming soon" |
| Concierge test | Value proposition | 1-2 weeks | Manually deliver the solution to 5-10 users |
| A/B test | Impact on outcome | 2-4 weeks | Build a minimal version and measure conversion impact |
| Wizard of Oz | Desirability of the end-to-end experience before building the real backend | 1-3 weeks | Simulate the feature with human effort behind the scenes |
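The A/B test row ultimately comes down to comparing two proportions: the conversion rate with and without the change. A minimal sketch of that comparison using a standard two-proportion z-test, with invented numbers:

```python
from math import sqrt

# Illustrative A/B test results: conversions out of trial users in each arm.
control_conversions, control_n = 96, 1200   # ~8.0% baseline
variant_conversions, variant_n = 126, 1180  # ~10.7% with the change

p1 = control_conversions / control_n
p2 = variant_conversions / variant_n
pooled = (control_conversions + variant_conversions) / (control_n + variant_n)
se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
z = (p2 - p1) / se

print(f"control {p1:.1%}, variant {p2:.1%}, z = {z:.2f}")
# |z| > 1.96 corresponds to p < 0.05 for a two-sided test
```

For the small-sample experiments (prototype tests, concierge tests), look for strong qualitative signals rather than statistical significance.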
Experiment design template:
For each experiment, document:

- The assumption being tested, stated so that it can be proven wrong
- The method (interview, prototype test, fake door, etc.) and who you'll run it with
- The success criteria -- the threshold that tells you the assumption held
- The timebox for running it
- What you'll do next if the assumption is validated, and what you'll do if it isn't
Building an OST: Step-by-Step
Step 1: Set Your Outcome (1 day)
Work with leadership to define a clear, measurable outcome for your team. If leadership assigns an output ("build feature X"), negotiate it into an outcome ("increase the metric that feature X is supposed to improve").
Step 2: Seed the Opportunity Space (2-4 weeks)
Conduct 6-10 customer interviews focused on stories related to your outcome. Synthesize findings into an initial set of opportunities. Organize them hierarchically on the tree.
Step 3: Prioritize Opportunities (1 day)
You can't pursue every opportunity simultaneously. Evaluate based on:

- Opportunity sizing: how many customers are affected, and how often
- Customer importance: how painful or valuable addressing it would be for them
- Outcome fit: how directly addressing it would move your target metric
- Company and market factors: strategic fit, competitive pressure, and your team's ability to address it well
Select 1-3 opportunities to focus on for the next few weeks.
Step 4: Generate Solutions (Half day)
For each prioritized opportunity, brainstorm 3-5 solutions with your product trio. Don't evaluate yet -- just generate.
Step 5: Map Assumptions (Half day)
For each solution, identify the key assumptions that must be true for it to work. Categorize them as desirability, usability, feasibility, or viability.
Step 6: Design and Run Experiments (Ongoing)
Start with the riskiest assumptions first. Design lightweight experiments. Run them. Learn. Update the tree based on results.
Step 7: Iterate Continuously
The OST is a living document. Every week:

- Add new opportunities surfaced in that week's interviews
- Record experiment results and prune solutions whose assumptions were invalidated
- Re-check that the opportunities you're focused on are still the most promising ones
- Archive branches that are no longer relevant so the tree stays readable
Real-World OST Example: Spotify's Podcast Discovery
Here's how Spotify might have structured an OST for improving podcast discovery:
Outcome: Increase podcast listening hours per user from 2 to 4 hours/week
├── Users don't know what podcasts to listen to
│   ├── Solution: Personalized podcast recommendations based on music taste
│   │   ├── Experiment: Interview 10 users about discovery behavior
│   │   └── Experiment: A/B test recommendation algorithm on 5% of users
│   ├── Solution: "Daily Podcast Mix" similar to Daily Mix for music
│   │   └── Experiment: Fake door test on home screen
│   └── Solution: Social sharing -- see what friends are listening to
│       └── Experiment: Survey 200 users on interest in social podcast features
├── Users start podcasts but don't finish
│   ├── Solution: Playback speed controls (1.5x, 2x)
│   │   └── Experiment: Usage data analysis of current speed control adoption
│   ├── Solution: Episode summaries with chapter markers
│   │   └── Experiment: Prototype test with 8 users
│   └── Solution: "Continue listening" prominent placement
│       └── Experiment: A/B test placement on home screen
└── Users forget to come back for new episodes
    ├── Solution: Smart notifications when new episodes drop
    │   └── Experiment: Opt-in notification test with 1,000 users
    ├── Solution: Auto-download new episodes from followed podcasts
    │   └── Experiment: Concierge test -- manually send "new episode" reminders
    └── Solution: Weekly podcast digest email
        └── Experiment: Email campaign to 5,000 users, measure re-engagement
Common Mistakes and Pitfalls
1. Starting with Solutions Instead of Opportunities
The most pervasive mistake. Teams put "Build feature X" on the tree without first understanding what customer need it addresses. If you can't articulate the opportunity, the solution doesn't belong on the tree yet.
2. Only One Solution Per Opportunity
If you have exactly one solution for each opportunity, you're not exploring the solution space. This leads to anchoring on the first idea and missing better alternatives. Force yourself to generate at least three solutions before committing.
3. Confusing Outputs with Outcomes
"Launch the new onboarding flow" is an output. "Increase activation rate from 30% to 45%" is an outcome. If your tree starts with an output, the entire structure is compromised because you've already decided on the solution before exploring alternatives.
4. Building the Tree Once and Never Updating It
The OST is a living artifact. It should change weekly as you learn from interviews and experiments. A stale tree is just a pretty diagram. Build the habit of updating the tree every week during your trio sync.
5. Skipping Experiments
Some teams use the OST to organize their thinking but then skip the experiment layer and jump straight to building. This defeats the purpose. The experiment layer is where you de-risk solutions before committing engineering resources.
6. Making the Tree Too Big
A tree with 50 opportunities, 200 solutions, and 400 experiments is unusable. Focus on the 3-5 most important opportunities and the 2-3 most promising solutions for each. Archive the rest. You can always bring them back.
7. Working in Isolation
The OST is designed for the product trio (PM + designer + engineer) to build and maintain together. A tree built by the PM alone misses technical feasibility insights from engineering and usability insights from design.
The Continuous Discovery Habits Behind OST
Teresa Torres' framework isn't just about the tree -- it's about the habits that keep the tree alive and useful:
Habit 1: Weekly Customer Interviews
Interview at least one customer per week. Not a big research project -- a lightweight, 30-minute conversation. Use an automated recruiting system so interviews happen consistently without manual scheduling overhead each time.
Habit 2: Interview Snapshots
After each interview, create a one-page summary capturing:

- Who you spoke with and the context they work in
- The specific story they told, in brief
- The opportunities (needs, pain points, desires) you heard
- One or two memorable quotes
- Any insights or surprises worth sharing with the wider team
Habit 3: Opportunity Mapping
Weekly, review your interview snapshots and update the opportunity space on your tree. Are new opportunities emerging? Are existing opportunities being validated by multiple interviews?
Habit 4: Assumption Testing
Continuously identify and test the riskiest assumptions in your tree. Run small experiments weekly -- not monthly or quarterly. This keeps the learning cycle fast and prevents big bets on unvalidated assumptions.
Habit 5: Trio Decision-Making
The product trio (PM, designer, engineer) makes discovery decisions together. Not the PM deciding and informing the others. Shared understanding leads to better solutions and stronger buy-in.
OST vs. Other Discovery Frameworks
| Factor | OST | Design Thinking | JTBD | Lean Startup | Story Mapping |
|---|---|---|---|---|---|
| Focus | Connecting outcomes to experiments | Creative problem-solving | Customer motivations | Business validation | User journey and delivery planning |
| Cadence | Continuous (weekly) | Project-based | Research-phase | Build-measure-learn cycles | Planning-phase |
| Key artifact | Visual tree | Prototypes | Job maps | MVPs | Story map |
| Team involved | Product trio | Cross-functional | Research team | Founding/product team | Cross-functional |
| Best for | Ongoing discovery alongside delivery | Tackling ambiguous problems | Understanding "why" behind behavior | Validating business models | Planning releases |
| Unique strength | Maintains connection between strategy and execution | Deep user empathy | Reveals hidden motivations | Speed to market learning | Visualization of user experience |
OST + JTBD
JTBD interviews are an excellent way to seed the opportunity space in your OST. JTBD reveals the jobs customers are trying to do and the underserved outcomes within those jobs. These become opportunities on your tree.
OST + RICE
Once you've identified opportunities and solutions on your tree, use RICE scoring to prioritize which solutions to pursue first. The OST ensures you're scoring the right things; RICE ensures you're sequencing them effectively.
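As a reminder of the mechanics, RICE scores each solution as Reach times Impact times Confidence divided by Effort. A minimal sketch of scoring the onboarding solutions from earlier; the numbers are invented for illustration:

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE = (Reach * Impact * Confidence) / Effort.

    reach: users affected per quarter, impact: 0.25-3 scale,
    confidence: 0-1, effort: person-months.
    """
    return (reach * impact * confidence) / effort

solutions = {
    "Interactive onboarding checklist": rice_score(4000, 2, 0.8, 2),   # 3200
    "Personalized getting-started path": rice_score(4000, 3, 0.5, 6),  # 1000
    "Quick wins tutorial": rice_score(2500, 1, 0.8, 1),                # 2000
}
for name, score in sorted(solutions.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.0f}")
```

The scores only rank solutions within the opportunity you've already chosen; the tree decides what's worth scoring in the first place.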
Best Practices for OST Implementation
Start Small
Don't try to build a complete tree in one session. Start with your outcome and 3-5 opportunities from recent customer conversations. Add to it weekly as you learn more.
Make It Visible
Put your OST on a wall, in Miro, in FigJam, or in a dedicated tool. Make it the first thing the team sees during planning sessions. Visibility keeps the tree alive and relevant.
Use the Tree in Stakeholder Conversations
When a stakeholder asks "Why aren't we building feature X?" walk them through the tree. Show how the outcome connects to opportunities, opportunities to solutions, and how you're testing solutions through experiments. This builds trust in your decision-making process.
Review the Tree Weekly
During your weekly product trio sync:

- Review the week's interview snapshots and add new opportunities to the tree
- Record experiment results and update or prune the affected solutions
- Decide which assumptions to test next and who owns each experiment
- Confirm that your target opportunity is still the right focus
Track Your Discovery Velocity
Measure how many interviews you're conducting per week, how many experiments you're running, and how many assumptions you're validating. These leading indicators tell you whether your discovery process is healthy.
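If you want more than a gut feel, a lightweight log of discovery activities is enough to see the trend week over week. A minimal sketch, assuming a simple list of dated entries:

```python
from collections import Counter
from datetime import date

# Illustrative discovery log: (date, activity type)
log = [
    (date(2024, 3, 4), "interview"),
    (date(2024, 3, 6), "experiment"),
    (date(2024, 3, 7), "assumption_validated"),
    (date(2024, 3, 12), "interview"),
]

# Count each activity type per ISO week.
weekly = Counter()
for day, activity in log:
    iso_year, iso_week, _ = day.isocalendar()
    weekly[(iso_year, iso_week, activity)] += 1

for (year, week, activity), count in sorted(weekly.items()):
    print(f"{year}-W{week:02d} {activity}: {count}")
```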
Connect Discovery to Delivery
The OST bridges discovery and delivery. When an experiment validates a solution, it moves into your delivery backlog with full context: the outcome it serves, the opportunity it addresses, the evidence supporting it, and the assumptions that have been validated. Engineers who understand the "why" build better solutions.
Getting Started with Opportunity Solution Trees
The Opportunity Solution Tree transforms product discovery from an occasional, unstructured activity into a continuous, disciplined practice. It ensures that everything your team builds is connected to a real customer need and a measurable outcome. That traceability -- from strategy through discovery through delivery -- is what separates teams that ship features from teams that ship outcomes.