What This Template Is For
Accessibility testing confirms that your product works for users who rely on assistive technology, keyboard navigation, screen magnification, or alternative input devices. Without a structured plan, teams rely on automated scanners alone. Automated tools catch roughly 30-40% of accessibility issues. The remaining 60-70% require manual evaluation with real assistive technology.
This template creates a structured accessibility test plan covering WCAG 2.1 AA criteria, screen reader testing protocols, keyboard navigation audits, color and contrast checks, and cognitive accessibility considerations. It organizes testing into automated scans, semi-automated checks, and manual evaluation sessions so teams know exactly what to test and how.
The template works for single-feature audits, full-product accessibility reviews, and pre-launch compliance checks. Use it alongside the test strategy template to integrate accessibility into your broader quality approach. The Technical PM Handbook covers how PMs can partner with engineering on non-functional requirements like accessibility. For tracking issues found during testing, the bug report template provides a structured format that includes severity and reproduction steps.
How to Use This Template
- Start by identifying the WCAG conformance level your product targets. Most teams aim for WCAG 2.1 AA, which is the legal standard in most jurisdictions and a practical baseline for inclusive design.
- Inventory the user flows that need testing. Prioritize flows by traffic volume and criticality. The checkout flow matters more than the settings page.
- For each flow, determine which WCAG criteria apply. Not every criterion applies to every page. A text-only page does not need audio alternative checks.
- Run automated scans first using axe, Lighthouse, or WAVE. These catch low-hanging issues quickly: missing alt text, color contrast failures, missing form labels, duplicate IDs.
- Follow automated scans with manual testing. Navigate the entire flow using only a keyboard. Test with at least two screen readers (VoiceOver on Mac, NVDA on Windows). Check focus management, reading order, and dynamic content announcements.
- Document findings with screenshots, screen reader output transcripts, and WCAG criterion references. This makes remediation faster because developers know exactly which criterion failed and how to verify the fix.
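The kinds of issues automated scanners flag in the first pass can be sketched in a few lines. This is an illustrative static-HTML sketch only, not a replacement for axe or Lighthouse, which evaluate the rendered DOM and cover many more rules; the class and sample markup here are hypothetical.

```python
# Sketch of two scanner-style checks named above: images missing alt
# attributes and form inputs with no associated label. Illustrative
# only -- real tools evaluate the live DOM and far more rules.
from html.parser import HTMLParser

class A11yScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []        # problems found while parsing
        self.labeled_ids = set()  # ids referenced by <label for="...">
        self.inputs = []        # (input id, has aria-label) pairs

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and "alt" not in a:
            self.issues.append("img missing alt attribute")
        if tag == "label" and a.get("for"):
            self.labeled_ids.add(a["for"])
        if tag == "input" and a.get("type") not in ("hidden", "submit", "button"):
            self.inputs.append((a.get("id"), "aria-label" in a))

    def report(self):
        # Labels may appear after their inputs, so resolve at the end.
        issues = list(self.issues)
        for input_id, has_aria in self.inputs:
            if not has_aria and input_id not in self.labeled_ids:
                issues.append(f"input '{input_id}' has no label")
        return issues

scanner = A11yScanner()
scanner.feed("""
<form>
  <img src="logo.png">
  <label for="email">Email</label>
  <input id="email" type="text">
  <input id="promo" type="text">
</form>
""")
print(scanner.report())  # two issues: missing alt, unlabeled 'promo' input
```

Even this toy version shows why scanners miss most issues: it can prove an alt attribute exists, but not that its text is meaningful.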
The Template
Test Plan Overview
| Field | Details |
|---|---|
| Product / Feature | [Name] |
| Test Plan Owner | [Name] |
| PM | [Name] |
| Target Conformance | [WCAG 2.1 AA] |
| Scope | [Full product / specific feature / specific pages] |
| Test Period | [Start date - End date] |
| Last Updated | [Date] |
User Flows Under Test
List every user flow that will be tested. Prioritize by traffic and business impact.
| Flow ID | User Flow | Pages Involved | Priority | Status |
|---|---|---|---|---|
| A1 | [e.g., Account registration] | [Registration, email verification, onboarding] | P0 | [Not started] |
| A2 | [e.g., Core product workflow] | [Dashboard, editor, save/publish] | P0 | [Not started] |
| A3 | [e.g., Billing and subscription] | [Pricing, checkout, invoice history] | P1 | [Not started] |
| A4 | [e.g., Search and filtering] | [Search results, filters, detail view] | P1 | [Not started] |
| A5 | [e.g., Settings and profile] | [Profile edit, notification preferences] | P2 | [Not started] |
Automated Scan Configuration
| Tool | Scope | Schedule | Owner | Ruleset |
|---|---|---|---|---|
| [axe-core / axe DevTools] | All pages in scope | [Every PR / daily / weekly] | [Engineering] | [WCAG 2.1 AA + Best Practices] |
| [Lighthouse CI] | Top 10 pages by traffic | [Every build in CI] | [Engineering] | [Accessibility audit, threshold 90+] |
| [WAVE browser extension] | Manually on new pages | [Before each release] | [QA] | [WCAG 2.1 AA errors and alerts] |
| [Pa11y CI] | Sitemap crawl | [Nightly] | [Engineering] | [WCAG 2.1 AA, ignore known issues] |
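The Pa11y CI row above is typically driven by a `.pa11yci` config file. A minimal sketch might look like the following; the URLs are placeholders and the exact options available depend on your pa11y-ci version, so check its documentation.

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000,
    "ignore": []
  },
  "urls": [
    "https://example.com/",
    "https://example.com/pricing",
    "https://example.com/signup"
  ]
}
```

Known, accepted issues can be moved into the `ignore` list so nightly runs fail only on regressions.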
Manual Testing Matrix
Keyboard Navigation
Test every flow using only the keyboard. No mouse, no trackpad.
| Check | Pass Criteria | Flow A1 | Flow A2 | Flow A3 | Flow A4 | Flow A5 |
|---|---|---|---|---|---|---|
| Tab order follows visual order | Focus moves left-to-right, top-to-bottom logically | ☐ | ☐ | ☐ | ☐ | ☐ |
| All interactive elements reachable | Every button, link, input, and control receives focus | ☐ | ☐ | ☐ | ☐ | ☐ |
| Focus visible on all elements | Clear visible focus indicator on every focused element | ☐ | ☐ | ☐ | ☐ | ☐ |
| No keyboard traps | User can Tab into and out of every component | ☐ | ☐ | ☐ | ☐ | ☐ |
| Skip navigation link present | "Skip to main content" link as first focusable element | ☐ | ☐ | ☐ | ☐ | ☐ |
| Modal focus management | Focus trapped inside open modal, returns on close | ☐ | ☐ | ☐ | ☐ | ☐ |
| Dropdown/menu keyboard support | Arrow keys navigate, Escape closes, Enter selects | ☐ | ☐ | ☐ | ☐ | ☐ |
| Custom widgets operable | Sliders, date pickers, tabs work with keyboard | ☐ | ☐ | ☐ | ☐ | ☐ |
Screen Reader Testing
| Screen Reader | Browser | OS | Tester | Flows Tested | Status |
|---|---|---|---|---|---|
| VoiceOver | Safari | macOS | [Name] | [A1, A2] | [Not started] |
| NVDA | Chrome | Windows | [Name] | [A1, A2, A3] | [Not started] |
| JAWS | Chrome | Windows | [Name] | [A1] | [Not started] |
| TalkBack | Chrome | Android | [Name] | [A2] | [Not started] |
Screen reader checks per flow:
- ☐ Page title announced on navigation
- ☐ Headings hierarchy is logical (h1 > h2 > h3, no skipped levels)
- ☐ Images have descriptive alt text (decorative images have empty alt)
- ☐ Form fields have associated labels announced before input
- ☐ Error messages announced when they appear
- ☐ Dynamic content changes announced via ARIA live regions
- ☐ Tables have proper header associations (th scope)
- ☐ Links have descriptive text (no "click here" or "read more")
- ☐ Button labels describe their action
- ☐ Custom components have proper ARIA roles and states
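The heading-hierarchy check in the list above (no skipped levels) is mechanical enough to sketch in code. This is a hypothetical static-HTML checker for illustration; real pages may inject headings dynamically, so verify against the rendered DOM.

```python
# Sketch of the "no skipped heading levels" check: walk headings in
# document order and flag any jump of more than one level (h1 -> h3).
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.prev_level = 0
        self.violations = []

    def handle_starttag(self, tag, attrs):
        # Match h1..h6 tags only.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.prev_level and level > self.prev_level + 1:
                self.violations.append(f"h{self.prev_level} -> h{level} skips a level")
            self.prev_level = level

checker = HeadingChecker()
checker.feed("<h1>Title</h1><h2>Section</h2><h4>Detail</h4>")
print(checker.violations)  # flags the h2 -> h4 jump
```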
Color and Contrast
| Check | Standard | Tool | Status |
|---|---|---|---|
| Text contrast ratio (normal text) | 4.5:1 minimum (WCAG AA) | [Colour Contrast Analyser / browser devtools] | ☐ |
| Text contrast ratio (large text) | 3:1 minimum (WCAG AA) | [Same tool] | ☐ |
| Non-text contrast (UI components) | 3:1 minimum (WCAG AA) | [Same tool] | ☐ |
| Information not conveyed by color alone | Color + icon/text/pattern | [Manual review] | ☐ |
| Focus indicator contrast | 3:1 against adjacent colors | [Manual review] | ☐ |
| Dark mode contrast (if applicable) | Same ratios as light mode | [Same tool] | ☐ |
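The ratios in the table come from the WCAG 2.1 relative-luminance formula, which is worth knowing when a tool's verdict looks surprising. The math below follows the spec; the hex colors are illustrative examples.

```python
# WCAG 2.1 contrast-ratio math behind the table above.
def _channel(c8: int) -> float:
    # sRGB channel -> linear value, per the WCAG relative-luminance formula.
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color: str) -> float:
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    # (L_lighter + 0.05) / (L_darker + 0.05)
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0, the maximum
print(contrast_ratio("#767676", "#ffffff") >= 4.5)     # True: passes AA for normal text
```

Gray #767676 on white sits at roughly 4.54:1, which is why it is often cited as the lightest gray that passes AA for body text.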
Cognitive Accessibility
| Check | Pass Criteria | Status |
|---|---|---|
| Error messages are clear and specific | Message says what went wrong and how to fix it | ☐ |
| Form instructions appear before the input | Help text precedes the field, not only on error | ☐ |
| Timeout warnings are provided | User warned before session timeout, can extend | ☐ |
| Consistent navigation across pages | Nav order and labeling identical on every page | ☐ |
| No content flashes more than 3 times/second | Nothing flashes above the threshold; motion animations respect prefers-reduced-motion | ☐ |
| Reading level appropriate | Body text at grade 8-10 reading level | ☐ |
WCAG 2.1 AA Criteria Checklist
Track conformance against the four WCAG principles (POUR).
Perceivable
| Criterion | Description | Automated | Manual | Status |
|---|---|---|---|---|
| 1.1.1 Non-text Content | All images, icons have text alternatives | Yes | Verify quality | ☐ |
| 1.2.1 Audio-only/Video-only (Prerecorded) | Text alternatives for prerecorded media | No | Full review | ☐ |
| 1.3.1 Info and Relationships | Semantic HTML, proper heading structure | Partial | Verify | ☐ |
| 1.3.2 Meaningful Sequence | DOM order matches visual order | No | Full review | ☐ |
| 1.4.1 Use of Color | Information not solely color-dependent | No | Full review | ☐ |
| 1.4.3 Contrast (Minimum) | 4.5:1 text, 3:1 large text | Yes | Spot check | ☐ |
| 1.4.4 Resize Text | Content usable at 200% zoom | Partial | Full review | ☐ |
| 1.4.11 Non-text Contrast | 3:1 for UI components and graphics | Partial | Full review | ☐ |
Operable
| Criterion | Description | Automated | Manual | Status |
|---|---|---|---|---|
| 2.1.1 Keyboard | All functionality keyboard-accessible | No | Full review | ☐ |
| 2.1.2 No Keyboard Trap | Focus can move freely into/out of components | No | Full review | ☐ |
| 2.4.1 Bypass Blocks | Skip navigation mechanism present | Partial | Verify | ☐ |
| 2.4.2 Page Titled | Descriptive, unique page titles | Yes | Verify quality | ☐ |
| 2.4.3 Focus Order | Tab order is logical and predictable | No | Full review | ☐ |
| 2.4.6 Headings and Labels | Descriptive headings and form labels | Partial | Verify quality | ☐ |
| 2.4.7 Focus Visible | Clear visible focus indicator | No | Full review | ☐ |
Understandable
| Criterion | Description | Automated | Manual | Status |
|---|---|---|---|---|
| 3.1.1 Language of Page | lang attribute set on the html element | Yes | N/A | ☐ |
| 3.2.1 On Focus | No unexpected context change on focus | No | Full review | ☐ |
| 3.3.1 Error Identification | Errors clearly identified and described | No | Full review | ☐ |
| 3.3.2 Labels or Instructions | Forms have clear labels and instructions | Partial | Verify quality | ☐ |
Robust
| Criterion | Description | Automated | Manual | Status |
|---|---|---|---|---|
| 4.1.1 Parsing | Valid HTML, no duplicate IDs | Yes | N/A | ☐ |
| 4.1.2 Name, Role, Value | Custom controls have proper ARIA | Partial | Full review | ☐ |
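The duplicate-ID half of 4.1.1 is another check that is simple to sketch. As with the other snippets, this is an illustrative static-HTML version; scanners like axe run the equivalent rule against the live DOM.

```python
# Sketch of the 4.1.1 duplicate-ID check: collect every id attribute
# and report values that appear more than once.
from collections import Counter
from html.parser import HTMLParser

class IdCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id":
                self.ids.append(value)

collector = IdCollector()
collector.feed('<div id="main"><span id="label"></span><p id="main"></p></div>')
duplicates = [i for i, n in Counter(collector.ids).items() if n > 1]
print(duplicates)  # ['main']
```

Duplicate IDs matter for accessibility because label-for and ARIA relationships resolve to the first match, silently breaking associations for later elements.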
Issue Tracking
| Issue ID | WCAG Criterion | Severity | Page/Flow | Description | Remediation | Owner | Status |
|---|---|---|---|---|---|---|---|
| A11Y-001 | [e.g., 1.4.3] | [Critical / Major / Minor] | [Page URL] | [Description of failure] | [Fix approach] | [Name] | [Open / Fixed / Accepted] |
Exit Criteria
Accessibility testing is complete when:
- ☐ All automated scans run with zero critical violations
- ☐ Keyboard navigation tested for all P0 and P1 flows
- ☐ Screen reader testing completed with at least 2 screen readers
- ☐ Color contrast meets WCAG AA ratios across all themes
- ☐ All critical and major issues resolved or have an accepted remediation timeline
- ☐ No keyboard traps exist anywhere in the tested flows
- ☐ ARIA implementation validated for all custom components
- ☐ Test results documented and shared with the team
Filled Example: SaaS Dashboard Accessibility Audit
Test Plan Overview
| Field | Details |
|---|---|
| Product / Feature | Analytics Dashboard v3.1 |
| Test Plan Owner | Alex Rivera, QA Lead |
| PM | Sarah Kim |
| Target Conformance | WCAG 2.1 AA |
| Scope | Dashboard, chart views, filter panel, export flows |
| Test Period | March 5 - March 19, 2026 |
Key Findings Summary
Automated scans (axe-core): 12 issues found. 4 critical (missing form labels on filter inputs), 5 major (low contrast text in chart legends), 3 minor (redundant ARIA roles).
Keyboard testing: Tab order breaks in the chart detail modal. Users cannot Tab out of the date range picker without pressing Escape first (keyboard trap). Filter chips are not keyboard-focusable.
Screen reader testing (VoiceOver + NVDA): Chart data is invisible to screen readers. SVG charts lack text alternatives. The "Applied filters" count badge updates silently without ARIA live region announcement. Data table sort controls announce as generic buttons instead of "Sort by Revenue, ascending."
Contrast: Chart color palette fails 3:1 non-text contrast for 3 of 8 data series colors. Legend text at 12px fails 4.5:1 against the gray background.
Issue Tracker (Sample)
| Issue ID | WCAG | Severity | Location | Description | Fix |
|---|---|---|---|---|---|
| A11Y-001 | 2.1.2 | Critical | Date picker | Keyboard trap in date range selector | Add Escape key handler and Tab-out support |
| A11Y-002 | 1.1.1 | Critical | All charts | SVG charts have no text alternative | Add descriptive aria-label and hidden data table |
| A11Y-003 | 1.4.3 | Major | Chart legend | 12px legend text at 3.2:1 contrast | Increase font to 14px or darken background |
| A11Y-004 | 4.1.2 | Major | Filter chips | Chips not keyboard-focusable | Add tabindex and key handlers to chip elements |
Key Takeaways
- Automated scans catch only 30-40% of accessibility issues. Manual testing with keyboards and screen readers is essential.
- Prioritize testing on high-traffic, business-critical flows first
- Test with at least two screen readers (VoiceOver + NVDA) to cover the majority of assistive technology users
- Integrate accessibility checks into CI to catch regressions early
- Document findings with WCAG criterion references so developers know exactly what to fix and how to verify
About This Template
Created by: Tim Adair
Last Updated: 3/5/2026
Version: 1.0.0
License: Free for personal and commercial use
