What This Template Is For
A Definition of Done (DoD) is the team's shared agreement on what "done" actually means. Without one, "done" becomes whatever the person who wrote the code thinks it means. One engineer calls it done after the PR is merged. Another waits until it is in production. A third never writes tests. The result is inconsistent quality and surprise bugs after every sprint.
This template gives you a structured DoD checklist that covers the full delivery pipeline: code quality, testing, documentation, accessibility, and deployment readiness. Customize it for your team's stack and standards. If you are building your team's agile practices from scratch, pair this with the Definition of Ready template to cover both ends of the story lifecycle. The Product Operations Handbook covers how to embed quality standards into your team's workflow.
When to Use This Template
- New team formation: Establish quality standards before the first sprint.
- After recurring quality issues: If bugs keep shipping to production, your DoD has gaps.
- When onboarding new engineers: The DoD sets expectations for what "complete" looks like.
- During agile transformation: Replace informal "it works on my machine" standards with explicit criteria.
Step-by-Step Instructions
Step 1: Draft the Checklist (15 minutes)
Start with the template below and remove items that do not apply to your stack. Add any team-specific standards (e.g., specific linting rules, security scans).
- ☐ Review each category (code, testing, docs, accessibility, deployment)
- ☐ Remove items that do not apply to your team
- ☐ Add team-specific standards
- ☐ Flag any items the team cannot currently meet (improvement opportunities)
Step 2: Review with the Team (30 minutes)
The DoD is a team agreement, not a PM mandate. Walk through each item and get explicit buy-in.
- ☐ Present the draft in a team meeting
- ☐ Discuss and resolve disagreements on each item
- ☐ Agree on which items are mandatory vs. aspirational
- ☐ Document the final version in your team wiki or project repo
Step 3: Apply and Iterate (Ongoing)
Use the DoD as a checklist during code review and sprint review. Update it every quarter or when the team's practices change.
- ☐ Add the DoD checklist to your PR template
- ☐ Reference it during sprint reviews
- ☐ Review and update quarterly in a retrospective
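The first item above can be done by embedding the checklist directly in the repository's pull request template. A minimal sketch (the file path is the one GitHub reads for PR templates; the exact wording should be adapted to your final DoD):

```markdown
<!-- .github/PULL_REQUEST_TEMPLATE.md -->
## Definition of Done

- [ ] Peer-reviewed and approved (at least one reviewer)
- [ ] Lint passes; no compiler warnings or errors
- [ ] Unit tests written and passing; no regressions in the suite
- [ ] Docs updated (API, help center, release notes) if applicable
- [ ] Accessibility checked for new UI
- [ ] Feature flag and rollback plan in place if applicable
```

Because GitHub pre-fills every new PR description with this file, the checklist appears exactly where reviewers look, with no searching required.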
The Definition of Done Template
Code Quality
- ☐ Code is peer-reviewed and approved (at least one reviewer)
- ☐ Code follows team style guide and passes linting
- ☐ No compiler warnings or errors
- ☐ No hardcoded secrets, API keys, or credentials
- ☐ Technical debt is documented if any was introduced
Testing
- ☐ Unit tests written and passing for new functionality
- ☐ Integration tests updated for affected flows
- ☐ Edge cases and error states tested
- ☐ Manual QA completed on staging environment
- ☐ No regressions in existing test suite
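As a sketch of what "edge cases and error states tested" can look like in practice, here is a hypothetical date-range filter with plain `node:assert` checks. The function name and data shape are illustrative assumptions, not part of the template:

```typescript
import { deepStrictEqual, strictEqual } from "node:assert";

// Hypothetical date-range filter, for illustration only -- the function
// name and data shape are assumptions, not part of the template.
interface Point {
  date: string; // ISO date string, so lexicographic compare works
  value: number;
}

function filterByDateRange(points: Point[], start: string, end: string): Point[] {
  return points.filter((p) => p.date >= start && p.date <= end);
}

// Edge cases from the checklist: empty data, a single data point,
// and a range that matches nothing.
deepStrictEqual(filterByDateRange([], "2024-01-01", "2024-12-31"), []);

const single: Point[] = [{ date: "2024-06-15", value: 42 }];
strictEqual(filterByDateRange(single, "2024-01-01", "2024-12-31").length, 1);
strictEqual(filterByDateRange(single, "2025-01-01", "2025-12-31").length, 0);
```

The point is not the filter itself but the habit: each checklist edge case maps to a named, repeatable assertion rather than a one-off manual check.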
Documentation
- ☐ Code comments explain non-obvious logic
- ☐ API documentation updated (if applicable)
- ☐ User-facing help docs updated (if applicable)
- ☐ Release notes drafted for user-visible changes
- ☐ Architecture decision records updated (if applicable)
Accessibility
- ☐ UI meets WCAG 2.1 AA standards
- ☐ Keyboard navigation works for new UI elements
- ☐ Screen reader tested (at least one screen reader)
- ☐ Color contrast meets minimum ratios
- ☐ Form inputs have proper labels and error messages
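The last item can be illustrated with a small markup sketch (the field names are hypothetical): a visible `<label>` tied to the input via `for`/`id`, and an error message linked with `aria-describedby` so screen readers announce it alongside the field.

```html
<label for="report-name">Report name</label>
<input
  id="report-name"
  name="report-name"
  type="text"
  aria-invalid="true"
  aria-describedby="report-name-error"
/>
<p id="report-name-error" role="alert">Report name is required.</p>
```

The `role="alert"` causes most screen readers to announce the error as soon as it appears, not only when the field regains focus.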
Deployment Readiness
- ☐ Feature flag configured (if applicable)
- ☐ Monitoring and alerts set up for new endpoints
- ☐ Rollback plan documented
- ☐ Performance tested (no significant regressions)
- ☐ Database migrations tested in staging
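The feature-flag item can be sketched with a minimal guard, assuming flags arrive as a plain string-to-boolean map (e.g. parsed from config or a flag service response). The flag name and helper are hypothetical, not a specific vendor's API:

```typescript
import { strictEqual } from "node:assert";

// Minimal feature-flag guard. Assumes flags are a plain
// string -> boolean map; the flag name is hypothetical.
type Flags = Record<string, boolean>;

function isEnabled(flags: Flags, name: string): boolean {
  // Missing or non-true flags default to off, which keeps rollback
  // as simple as removing or disabling the flag.
  return flags[name] === true;
}

const flags: Flags = { REPORT_SCHEDULING: true };
strictEqual(isEnabled(flags, "REPORT_SCHEDULING"), true);
strictEqual(isEnabled(flags, "NONEXISTENT_FLAG"), false);
```

Defaulting unknown flags to off is the property that makes "rollback: disable the flag" a safe, documented plan rather than a hope.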
Example
Here is a filled-in DoD for a B2B SaaS team building a dashboard application.
Code Quality
- ☑ Code is peer-reviewed and approved (at least one reviewer)
- ☑ Code follows team style guide and passes ESLint + Prettier
- ☑ No TypeScript errors or warnings
- ☑ No hardcoded secrets (checked with git-secrets pre-commit hook)
- ☑ Technical debt: added TODO for refactoring the chart rendering module
Testing
- ☑ Unit tests written for the new date range filter (12 tests, all passing)
- ☑ Integration tests updated for the report generation flow
- ☑ Edge cases tested: empty data, single data point, 10K+ rows
- ☑ Manual QA completed on staging by QA lead
- ☑ Full test suite passing (247/247 tests green)
Documentation
- ☑ Code comments added to the custom aggregation algorithm
- ☑ REST API docs updated with new /reports/schedule endpoint
- ☑ Help center article drafted for the report scheduling feature
- ☑ Release notes: "You can now schedule recurring reports from the dashboard"
Accessibility
- ☑ Date picker is keyboard-navigable
- ☑ Screen reader tested with VoiceOver on macOS
- ☑ Color contrast passes (checked with Axe browser extension)
- ☑ Form error messages announced by screen readers
Deployment Readiness
- ☑ Feature flag REPORT_SCHEDULING configured in LaunchDarkly
- ☑ Datadog alerts set for /reports/schedule error rate > 1%
- ☑ Rollback: disable feature flag, no migration rollback needed
- ☑ Load tested: 500 concurrent scheduled reports, p99 < 800ms
Tips
- Start small and expand. A 5-item DoD that the team actually follows beats a 30-item checklist that everyone ignores. Add items as the team matures.
- Make it visible. Post the DoD in your PR template, your team wiki, and your sprint board. If people have to search for it, they will not use it.
- The DoD is a team agreement. The PM can propose items, but engineers need to agree. A DoD imposed from above breeds resentment, not quality.
- Distinguish "done" from "accepted." Done means the team considers it complete. Accepted means the product manager has verified it meets the acceptance criteria. Both should happen before a story is closed.
- Track DoD compliance. If the same checklist item keeps getting skipped, either the item is unrealistic or the team needs better tooling. Discuss it in the retrospective.
