Building accessible HTML is only half the job. You must verify that your site actually works for people using assistive technology. The best approach combines automated tools with manual testing.
Automated Testing
Automated tools can catch roughly 30-50% of accessibility issues. They are fast, consistent, and great for catching low-hanging fruit.
axe DevTools
The axe browser extension by Deque is the industry standard. Install it for Chrome or Firefox, open DevTools, and run a scan:
- Open the page you want to test
- Open DevTools and go to the "axe DevTools" tab
- Click "Scan ALL of my page"
- Review the list of issues, grouped by severity
Each issue includes the element, the rule violated, and a "How to fix" explanation. axe catches missing alt text, color contrast failures, missing form labels, duplicate IDs, and many more.
axe-core in CI
You can run axe programmatically in your test suite to prevent regressions:
```js
// Using axe-core with Playwright
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('homepage has no accessibility violations', async ({ page }) => {
  await page.goto('/');
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});
```

This catches accessibility regressions before they reach production. Run these tests in your CI pipeline alongside your other tests.
Lighthouse
Chrome's built-in Lighthouse audit includes an accessibility score. Open DevTools, go to the Lighthouse tab, and run an audit. It checks a subset of axe rules and gives you a score out of 100.
While useful for a quick overview, Lighthouse tests fewer rules than a full axe scan. Use both.
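If you want Lighthouse running continuously rather than by hand, the Lighthouse CI tooling (`@lhci/cli`) can fail a build when the accessibility score drops below a threshold. A minimal `lighthouserc.json` sketch, where the URL and the 0.95 threshold are placeholders to adjust for your project:

```json
{
  "ci": {
    "collect": {
      "url": ["http://localhost:3000/"]
    },
    "assert": {
      "assertions": {
        "categories:accessibility": ["error", { "minScore": 0.95 }]
      }
    }
  }
}
```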
eslint-plugin-jsx-a11y
For React projects, this ESLint plugin catches accessibility issues in your JSX at development time:
```json
{
  "extends": ["plugin:jsx-a11y/recommended"]
}
```

It flags missing `alt` props on `<img>`, click handlers on non-interactive elements, missing labels, and more — right in your editor.
Manual Testing
Automated tools cannot test the user experience. They cannot tell you if a screen reader announcement makes sense, if the focus order is logical, or if a custom widget is truly usable. Manual testing fills this gap.
Keyboard Testing
The simplest manual test requires no special tools:
- Put your mouse away
- Press Tab to navigate through the entire page
- Verify you can reach every interactive element
- Verify focus indicators are visible at all times
- Verify you can activate links with Enter and buttons with Enter or Space
- Verify you can open and close all modals and menus
- Verify focus returns to the trigger when dialogs close
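The tab order this checklist exercises follows a simple rule: elements with a positive `tabindex` come first (sorted ascending), then naturally focusable elements and `tabindex="0"` in document order. A sketch of that rule (`tabOrder` is a hypothetical helper, not a browser API) shows why positive `tabindex` values cause surprises:

```javascript
// Sketch of how browsers order Tab stops. Each input is a plain object
// standing in for a focusable element; this is not a real DOM API.
function tabOrder(elements) {
  // Positive tabindex values come first, ascending (stable sort keeps
  // document order for ties)...
  const positive = elements
    .filter((el) => el.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex);
  // ...followed by tabindex="0" / naturally focusable elements in
  // document order.
  const natural = elements.filter((el) => el.tabindex === 0);
  return [...positive, ...natural].map((el) => el.id);
}

// A positive tabindex jumps ahead of everything before it, which is why
// accessibility guidance recommends avoiding tabindex values above 0.
const order = tabOrder([
  { id: 'search', tabindex: 0 },
  { id: 'promo', tabindex: 1 },
  { id: 'nav', tabindex: 0 },
]);
console.log(order); // ['promo', 'search', 'nav']
```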
Screen Reader Testing
Test with at least one screen reader. The most common pairings:
| Screen Reader | Platform | Browser |
|---|---|---|
| NVDA | Windows | Firefox or Chrome |
| JAWS | Windows | Chrome |
| VoiceOver | macOS / iOS | Safari |
| TalkBack | Android | Chrome |
Essential screen reader commands to learn (NVDA example):
- NVDA + Down Arrow — read continuously from current position
- H — jump to next heading
- D — jump to next landmark
- Tab — jump to next interactive element
- Enter — activate link or button
When testing, listen for:
- Do images have meaningful alt text (or are decorative images silent)?
- Are form fields announced with their labels?
- Do headings convey the page structure?
- Are dynamic changes (notifications, errors) announced?
- Can you complete core tasks (sign up, checkout, search)?
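The "dynamic changes announced" check in particular depends on live regions. As a minimal sketch (the element ID and message are illustrative), a status message a screen reader should announce without moving focus:

```html
<!-- A polite live region: screen readers announce text inserted into it
     without interrupting what they are currently reading -->
<div role="status" aria-live="polite" id="form-status"></div>

<script>
  // After a successful save, inject the message so it is announced
  document.getElementById('form-status').textContent = 'Changes saved.';
</script>
```

If nothing is announced when content changes, that is exactly the kind of issue automated scanners tend to miss.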
Color Contrast
Use a contrast checker to verify text meets WCAG AA minimums:
- Normal text — 4.5:1 contrast ratio
- Large text (24px+ regular or 18.66px+ bold, i.e. 18pt/14pt) — 3:1 contrast ratio
- UI components and graphics — 3:1 contrast ratio
Tools like the WebAIM Contrast Checker or the Chrome DevTools color picker show the contrast ratio and whether it passes.
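The ratio these tools report comes from WCAG's relative-luminance formula, so you can also compute it in code. A sketch assuming 6-digit hex colors (the helper names are our own, not from any library):

```javascript
// Relative luminance per the WCAG 2.x definition, for a "#rrggbb" color
function relativeLuminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const channel = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize the sRGB channel value
    return channel <= 0.03928
      ? channel / 12.92
      : ((channel + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1 to 21
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)].sort(
    (a, b) => b - a
  );
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio('#000000', '#ffffff').toFixed(2)); // 21.00
console.log(contrastRatio('#777777', '#ffffff').toFixed(2)); // 4.48 — just fails AA for normal text
```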
A Testing Workflow
Combine automated and manual testing into a practical workflow:
- During development — eslint-plugin-jsx-a11y catches issues in your editor
- Before committing — run axe-core tests in your test suite
- During code review — run a quick keyboard navigation test
- Before release — do a full screen reader walkthrough of core user flows
- Monthly — run a comprehensive Lighthouse and axe scan of all pages
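As one hypothetical wiring of the commit-time step, a GitHub Actions job that runs the axe-core Playwright tests described above (the Node version and browser choice are assumptions for your project):

```yaml
# Hypothetical CI job running the Playwright + axe-core tests
name: accessibility
on: [push, pull_request]
jobs:
  a11y:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx playwright install --with-deps chromium
      - run: npx playwright test
```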
No single method catches everything. Automated tools catch the easy wins. Keyboard testing catches focus and interaction issues. Screen reader testing catches the experience issues that only a human can evaluate.