What This Template Is For
A proof of concept should answer one question: does this product solve the customer's problem in their real environment? Too many POCs drift into extended free trials with no clear success criteria, no evaluation timeline, and no commitment to make a decision at the end. This template gives PMs a structured framework for designing, running, and evaluating POCs that lead to clear go/no-go decisions.
The template covers three phases: POC design (before), execution tracking (during), and evaluation (after). Use it for enterprise deals where the buyer requires hands-on validation before signing a contract.
For gathering the requirements that feed into POC criteria, see the enterprise requirements template. For broader guidance on discovery and validation, the Product Discovery Handbook covers structured evaluation methods.
When to Use This Template
- Enterprise sales cycle: When the buyer requires a POC or pilot before committing to a contract.
- Technical evaluation: When the customer's engineering team needs to validate integration, performance, or security claims.
- Competitive bake-off: When the customer is evaluating your product against one or two competitors in parallel.
- New use case validation: When a customer wants to use your product for a use case you have not explicitly supported before.
- Large deal justification: When the deal size requires procurement to see evidence of value before approving the purchase.
- Renewal risk mitigation: When an at-risk account needs to re-validate value before renewing.
POC anti-pattern: If the customer wants a "POC" with no defined success criteria, no timeline, and no commitment to decide afterward, they are asking for a free trial, not a POC. Push for structure or decline the request.
Step-by-Step Instructions
Step 1: Design the POC (20 minutes)
Before the POC starts, align on success criteria, timeline, participants, and the decision framework.
- ☐ Define 3-5 measurable success criteria with the customer
- ☐ Set a fixed timeline (2-4 weeks is typical)
- ☐ Identify participants (users, evaluators, decision makers)
- ☐ Agree on the decision process: who decides, what happens if criteria are met, what happens if they are not
Step 2: Execute and Track (During the POC)
Monitor progress weekly. Do not wait until the end to discover the customer has not used the product.
- ☐ Check usage metrics weekly (logins, feature usage, data volume)
- ☐ Hold mid-point check-in with the customer
- ☐ Address blockers within 24 hours
- ☐ Document feedback as it surfaces
Step 3: Evaluate Results (15 minutes)
After the POC ends, score each criterion, collect stakeholder feedback, and prepare the evaluation summary.
Step 4: Present and Decide (10 minutes)
Present findings to the customer's decision-making committee. Use the go/no-go framework to guide the conversation toward a commitment.
The POC Evaluation Template
Section 1: POC Design
| Field | Details |
|---|---|
| Customer | [Company name] |
| Deal value | $[Estimated ARR] |
| POC start date | [Date] |
| POC end date | [Date] |
| Duration | [X weeks] |
| POC type | [Technical validation / Business value / Competitive bake-off] |
| PM owner | [Name] |
| SE owner | [Name] |
| Sales owner | [Name] |
| Customer POC lead | [Name, Title] |
| Customer decision maker | [Name, Title] |
| Competitors in evaluation | [Names or "None"] |
Section 2: Success Criteria
Define 3-5 measurable criteria before the POC begins. Each criterion must have a target value and a measurement method. Vague criteria like "users like the product" are not acceptable.
| ID | Criterion | Target | Measurement Method | Weight | Status |
|---|---|---|---|---|---|
| SC-01 | [e.g., Complete end-to-end workflow in under 10 minutes] | [10 min or less] | [Screen recording + timer] | [30%] | Not started |
| SC-02 | [e.g., Import 10,000 records with <1% error rate] | [<1% errors] | [Import log report] | [25%] | |
| SC-03 | [e.g., SSO integration configured and working] | [All test users login via SSO] | [Login audit log] | [20%] | |
| SC-04 | [e.g., Reporting dashboard matches current manual process] | [95% parity] | [Side-by-side comparison] | [15%] | |
| SC-05 | [e.g., User satisfaction score from test group] | [4.0+ out of 5.0] | [Post-POC survey] | [10%] | |
Total weight must equal 100%.
Customer sign-off on criteria: [Name, Date, or "Pending"]
Section 3: POC Scope and Boundaries
In Scope
- ☐ [Feature or workflow 1]
- ☐ [Feature or workflow 2]
- ☐ [Integration with system X]
- ☐ [Data import from source Y]
- ☐ [User roles: Admin, Manager, End User]
Explicitly Out of Scope
- ☐ [Feature or workflow that will not be tested]
- ☐ [Integration that is deferred to implementation phase]
- ☐ [Performance testing at production scale]
- ☐ [Custom development or configuration beyond standard offering]
POC Environment
| Dimension | Details |
|---|---|
| Environment | [Production / Sandbox / Dedicated POC instance] |
| Data | [Real data / Synthetic data / Subset of production data] |
| Users | [X test users across Y roles] |
| Integrations active | [List which integrations will be live during POC] |
| Support level | [Dedicated SE / Standard support / Self-serve] |
Section 4: Execution Tracker
Week-by-Week Progress
| Week | Planned Activities | Actual Progress | Blockers | Resolution |
|---|---|---|---|---|
| Week 1 | [Environment setup, user onboarding, initial data import] | [What actually happened] | [Any blockers] | [How resolved] |
| Week 2 | [Feature testing, integration validation, user workflows] | | | |
| Week 3 | [Advanced use cases, edge cases, reporting validation] | | | |
| Week 4 | [Final testing, survey, evaluation meeting] | | | |
Usage Metrics
Track weekly to catch non-engagement early.
| Metric | Week 1 | Week 2 | Week 3 | Week 4 | Target |
|---|---|---|---|---|---|
| Active users (logged in) | [X/Y] | | | | [Y users] |
| Sessions per user | [X] | | | | [3+/week] |
| Core features used | [X/Y] | | | | [Y features] |
| Data records processed | [X] | | | | [X target] |
| Support tickets filed | [X] | | | | [<5] |
Feedback Log
| Date | Source | Feedback | Category | Severity | Response |
|---|---|---|---|---|---|
| [Date] | [Name, Role] | [Feedback verbatim] | Feature / Bug / UX / Performance | Critical / Major / Minor | [Action taken] |
Section 5: Evaluation Results
Success Criteria Scoring
| ID | Criterion | Target | Actual Result | Score | Notes |
|---|---|---|---|---|---|
| SC-01 | [Criterion from Section 2] | [Target] | [Measured result] | Pass / Partial / Fail | [Context] |
| SC-02 | | | | | |
| SC-03 | | | | | |
| SC-04 | | | | | |
| SC-05 | | | | | |
Weighted score: [Calculate weighted average based on weights in Section 2]
Scoring guide:
- Pass (100%): Met or exceeded the target value
- Partial (50%): Partially met. Workaround exists or minor gap
- Fail (0%): Did not meet the target. Significant gap
Weighted score thresholds:
- 80-100%: Strong pass. Proceed to contract.
- 60-79%: Conditional pass. Address gaps with a remediation plan and timeline.
- Below 60%: Fail. Do not proceed unless gaps can be closed within 30 days.
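The scoring model above is simple enough to sanity-check in a few lines. The sketch below is an illustrative Python version, assuming Pass/Partial/Fail map to 100%/50%/0% and weights that sum to 100%, as described in Section 2; the criterion weights and results shown are example values, not from any real POC.

```python
# Illustrative sketch of the weighted POC scoring model.
# Weights and results below are example placeholders.

SCORE_VALUES = {"Pass": 1.0, "Partial": 0.5, "Fail": 0.0}

def weighted_score(criteria):
    """criteria: list of (weight_percent, result) tuples; returns a 0-100 score."""
    total_weight = sum(weight for weight, _ in criteria)
    if total_weight != 100:
        raise ValueError(f"Weights must sum to 100%, got {total_weight}%")
    return sum(weight * SCORE_VALUES[result] for weight, result in criteria)

def decision(score):
    """Map a weighted score to the go/no-go thresholds above."""
    if score >= 80:
        return "Strong pass"
    if score >= 60:
        return "Conditional pass"
    return "Fail"

# Example using the Section 2 weights: three passes, one partial, one fail.
criteria = [(30, "Pass"), (25, "Pass"), (20, "Partial"), (15, "Pass"), (10, "Fail")]
score = weighted_score(criteria)
print(score, decision(score))  # 80.0 Strong pass
```

Note how a single failed low-weight criterion still yields a strong pass here, which is exactly why the weights need customer sign-off before the POC begins.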
Stakeholder Feedback Summary
| Stakeholder | Role | Overall Rating (1-5) | Top Positive | Top Concern | Purchase Recommendation |
|---|---|---|---|---|---|
| [Name] | Business Sponsor | [X] | [What they valued most] | [Biggest concern] | Yes / Conditional / No |
| [Name] | End User | [X] | | | |
| [Name] | IT/Engineering | [X] | | | |
| [Name] | Security | [X] | | | |
Average stakeholder rating: [X/5.0]
Competitive Comparison (if applicable)
| Dimension | Your Product | Competitor A | Competitor B |
|---|---|---|---|
| Success criteria met | [X/5] | [X/5] | [X/5] |
| User satisfaction score | [X/5.0] | [X/5.0] | [X/5.0] |
| Time to value (days to first workflow) | [X days] | [X days] | [X days] |
| Integration readiness | [Strong / Adequate / Weak] | [Strong / Adequate / Weak] | [Strong / Adequate / Weak] |
| Security/compliance fit | [Strong / Adequate / Weak] | [Strong / Adequate / Weak] | [Strong / Adequate / Weak] |
| Pricing competitiveness | [Favorable / Comparable / Unfavorable] | [Favorable / Comparable / Unfavorable] | [Favorable / Comparable / Unfavorable] |
Section 6: Gap Analysis and Remediation
| ID | Gap | Severity | Remediation Plan | Owner | Timeline | Cost |
|---|---|---|---|---|---|---|
| G-01 | [Gap from failed or partial criteria] | Critical / Major / Minor | [Configuration / Custom dev / Roadmap / Workaround] | [Name] | [Date] | [$X or included] |
| G-02 | ||||||
| G-03 |
Total remediation effort: [X engineering weeks]
Remediation cost: [$X or included in contract]
Section 7: Go/No-Go Decision
| Factor | Assessment |
|---|---|
| Success criteria score | [X%] |
| Stakeholder recommendation | [Majority Yes / Mixed / Majority No] |
| Gap severity | [No critical gaps / Critical gaps with remediation plan / Unresolvable gaps] |
| Competitive position | [Winning / Tied / Losing] |
| Deal economics | [Profitable / Break-even / Requires discount] |
Decision: Proceed to Contract / Conditional Proceed / Extend POC / No-Go
Conditions (if conditional):
- [Condition 1 with deadline]
- [Condition 2 with deadline]
Next steps:
- [Action with owner and date]
- [Action with owner and date]
- [Action with owner and date]
Filled-Out Example: Analytics Platform POC
POC Design (Example)
| Field | Details |
|---|---|
| Customer | RetailTech Inc. |
| Deal value | $95,000 ARR |
| Duration | 3 weeks (Feb 10 - Feb 28, 2026) |
| POC type | Technical validation + competitive bake-off vs. Amplitude |
Success Criteria Results (Example)
| Criterion | Target | Result | Score |
|---|---|---|---|
| Ingest 500K events/day with <2s latency | <2s p95 | 1.4s p95 | Pass |
| Custom dashboard creation in <15 min | <15 min | 12 min avg | Pass |
| Segment integration (bidirectional) | Working sync | Outbound only (inbound roadmapped) | Partial |
| User satisfaction from 8 test users | 4.0+/5.0 | 4.3/5.0 | Pass |
Weighted score: 87%. Strong pass.
Decision (Example)
Proceed to contract with one condition: inbound Segment sync delivered within 60 days of contract signing. Engineering confirmed feasibility. Customer accepted timeline.
Tips for Running Effective POCs
- Set a fixed end date. POCs without deadlines become perpetual evaluations. Two to four weeks is sufficient for most enterprise products. If the customer needs more time, something is wrong with the scope or the engagement.
- Check usage in week one. If the customer has not logged in by day 5, the POC is at risk. Proactively reach out, offer a working session, and remove friction. Non-engagement is the number one POC failure mode.
- Treat the mid-point check-in as mandatory. At the halfway mark, review progress against criteria. Adjust scope if needed. This prevents surprises in the final evaluation. For structuring stakeholder alignment throughout, see the Stakeholder Management Handbook.
- Get written criteria sign-off. Verbal agreement on success criteria is not enough. Send an email summary and get explicit confirmation. This protects both sides during the evaluation discussion.
- Prepare the evaluation presentation. Do not hand the customer raw data and ask them to interpret it. Present the results in a clear, visual format that makes the go/no-go decision obvious. Use the stakeholder review template for presentation structure.
- Know when to walk away. If the weighted score is below 60% and the gaps require months of engineering work, recommend a no-go. Your engineering time is better spent on customers where the product fits today.
Key Takeaways
- Define 3-5 measurable success criteria before the POC starts
- Set a fixed timeline of 2-4 weeks and stick to it
- Monitor usage weekly and intervene immediately if engagement drops
- Use a weighted scoring model for objective go/no-go decisions
- Walk away from deals where the product does not fit rather than extending indefinitely
About This Template
Created by: Tim Adair
Last Updated: 3/5/2026
Version: 1.0.0
License: Free for personal and commercial use
