Accessibility Testing

Building accessible HTML is only half the job. You must verify that your site actually works for people using assistive technology. The best approach combines automated tools with manual testing.

Automated Testing

Automated tools can catch roughly 30-50% of accessibility issues. They are fast, consistent, and great for catching low-hanging fruit.

axe DevTools

The axe browser extension by Deque is the industry standard. Install it for Chrome or Firefox, open DevTools, and run a scan:

  1. Open the page you want to test
  2. Open DevTools and go to the "axe DevTools" tab
  3. Click "Scan ALL of my page"
  4. Review the list of issues, grouped by severity

Each issue includes the element, the rule violated, and a "How to fix" explanation. axe catches missing alt text, color contrast failures, missing form labels, duplicate IDs, and many more.

axe-core in CI

You can run axe programmatically in your test suite to prevent regressions:

// Using axe-core with Playwright
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('homepage has no accessibility violations', async ({ page }) => {
  await page.goto('/');

  const results = await new AxeBuilder({ page }).analyze();

  expect(results.violations).toEqual([]);
});

This catches accessibility regressions before they reach production. Run these tests in your CI pipeline alongside your other tests.
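When the empty-array assertion fails, the raw violation objects are verbose. A small helper (a sketch, not part of axe's API — the function name is illustrative) can condense `results.violations` into one readable line per rule before asserting:

```javascript
// Summarize axe-core violations into one readable line each.
// Each violation object carries: id, impact, help, and a nodes array
// whose entries include a `target` list of CSS selectors.
function summarizeViolations(violations) {
  return violations.map((v) => {
    const targets = v.nodes
      .map((node) => node.target.join(' '))
      .join(', ');
    return `[${v.impact}] ${v.id}: ${v.help} (${targets})`;
  });
}

// In the Playwright test, assert on the summaries instead, so a
// failure prints the offending rules and selectors:
// expect(summarizeViolations(results.violations)).toEqual([]);
```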

Lighthouse

Chrome's built-in Lighthouse audit includes an accessibility score. Open DevTools, go to the Lighthouse tab, and run an audit. It checks a subset of axe rules and gives you a score out of 100.

While useful for a quick overview, Lighthouse tests fewer rules than a full axe scan. Use both.

eslint-plugin-jsx-a11y

For React projects, this ESLint plugin catches accessibility issues in your JSX at development time:

{
  "extends": ["plugin:jsx-a11y/recommended"]
}

It flags missing alt props on <img>, click handlers on non-interactive elements, missing labels, and more — right in your editor.
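The recommended preset is a good default, and you can tighten or relax individual rules on top of it. For example (these are real jsx-a11y rule names; the severity choices are illustrative):

```
{
  "extends": ["plugin:jsx-a11y/recommended"],
  "rules": {
    "jsx-a11y/alt-text": "error",
    "jsx-a11y/no-autofocus": "warn"
  }
}
```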

Manual Testing

Automated tools cannot test the user experience. They cannot tell you if a screen reader announcement makes sense, if the focus order is logical, or if a custom widget is truly usable. Manual testing fills this gap.

Keyboard Testing

The simplest manual test requires no special tools:

  1. Put your mouse away
  2. Press Tab to navigate through the entire page
  3. Verify you can reach every interactive element
  4. Verify focus indicators are visible at all times
  5. Verify you can activate buttons with Enter/Space
  6. Verify you can open and close all modals and menus
  7. Verify focus returns to the trigger when dialogs close
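A common cause of illogical focus order in step 2 is positive tabindex values: the browser visits them first, in ascending order, before elements with tabindex 0 (or naturally focusable elements) in DOM order, while negative values are skipped entirely. A minimal sketch of that ordering rule (the helper name is illustrative, not a browser API):

```javascript
// Given elements in DOM order with their tabindex values,
// return the order in which Tab will visit them.
// Rule: positive tabindex first (ascending), then tabindex 0 in DOM order.
// Negative tabindex is only focusable programmatically, so it is skipped.
function tabOrder(elements) {
  const positive = elements
    .filter((el) => el.tabindex > 0)
    .sort((a, b) => a.tabindex - b.tabindex); // stable sort: ties keep DOM order
  const natural = elements.filter((el) => el.tabindex === 0);
  return [...positive, ...natural].map((el) => el.id);
}
```

If the resulting order surprises you, the usual fix is to remove the positive tabindex values and let DOM order drive focus.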

Screen Reader Testing

Test with at least one screen reader. The most common pairings:

  Screen Reader   Platform      Browser
  NVDA            Windows       Firefox or Chrome
  JAWS            Windows       Chrome
  VoiceOver       macOS / iOS   Safari
  TalkBack        Android       Chrome

Essential screen reader commands to learn (NVDA example):

  • NVDA + Down Arrow — read continuously from current position
  • H — jump to next heading
  • D — jump to next landmark
  • Tab — jump to next interactive element
  • Enter — activate link or button

When testing, listen for:

  • Do images have meaningful alt text (or are decorative images silent)?
  • Are form fields announced with their labels?
  • Do headings convey the page structure?
  • Are dynamic changes (notifications, errors) announced?
  • Can you complete core tasks (sign up, checkout, search)?

Color Contrast

Use a contrast checker to verify text meets WCAG AA minimums:

  • Normal text — 4.5:1 contrast ratio
  • Large text (24px+ regular, or 18.66px+ bold) — 3:1 contrast ratio
  • UI components and graphics — 3:1 contrast ratio

Tools like the WebAIM Contrast Checker or the Chrome DevTools color picker show the contrast ratio and whether it passes.
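These ratios come from WCAG's relative luminance formula, which you can also compute directly — handy for checking a design system's palette in bulk. A sketch in plain JavaScript (function names are illustrative):

```javascript
// WCAG relative luminance of an sRGB color (channel values 0-255).
function luminance(r, g, b) {
  const [rl, gl, bl] = [r, g, b].map((c) => {
    const s = c / 255;
    // Linearize the gamma-encoded channel per the WCAG definition.
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl;
}

// WCAG contrast ratio between two colors: 1 (identical) up to 21.
function contrastRatio(fg, bg) {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black text on a white background yields the maximum ratio, 21:1.
// Gray #767676 on white sits just above the 4.5:1 AA threshold.
```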

A Testing Workflow

Combine automated and manual testing into a practical workflow:

  1. During development — eslint-plugin-jsx-a11y catches issues in your editor
  2. Before committing — run axe-core tests in your test suite
  3. During code review — run a quick keyboard navigation test
  4. Before release — do a full screen reader walkthrough of core user flows
  5. Monthly — run a comprehensive Lighthouse and axe scan of all pages

No single method catches everything. Automated tools catch the easy wins. Keyboard testing catches focus and interaction issues. Screen reader testing catches the experience issues that only a human can evaluate.