WCAG-Compliant and Still Inaccessible: The Sensory Layer Your Audit Skips


The WCAG 2.1 audit came back clean. Four people on the accessibility team reviewed the report, 38 success criteria passed, two minor contrast issues were logged and fixed. The compliance certificate went into the legal folder. The design lead signed off.

Three months later, a user with ADHD spent 11 minutes trying to complete a task the design team assumed took four. She gave up twice. The third attempt succeeded — barely — and she left the site with the particular exhaustion of someone who had to fight for something that shouldn't have required fighting.

Nothing in that experience would appear in a WCAG audit.

What WCAG Was Built to Measure

WCAG — the Web Content Accessibility Guidelines — emerged from the W3C in 1999 and has been updated through versions 2.0, 2.1, and now 2.2. The spec is organized around four principles: perceivable, operable, understandable, robust. Success criteria are specific and binary: a minimum contrast ratio of 4.5:1 for normal body text, keyboard navigability for all interactive elements, text alternatives for non-text content.

That precision is the spec's strength and its structural limit. The measurement model works because it's objective. A color combination either passes or fails the contrast formula. A button either receives keyboard focus or doesn't. Compliance is verifiable by automated tools, repeatable across teams, and defensible in legal contexts.
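That pass/fail objectivity is easiest to see in the contrast criterion: the spec defines relative luminance and the ratio formula exactly, so the whole check is a few lines of arithmetic. A minimal sketch in Python — the constants come from the WCAG 2.x definition; the function names are mine:

```python
# WCAG 2.x contrast-ratio check, implemented from the spec's formulas.

def _linearize(channel: int) -> float:
    """Convert one 0-255 sRGB channel to linear light (per the spec)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# 4.5:1 is the AA floor for normal body text. Note that the spec sets
# only a floor — there is no maximum, which matters later in this piece.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # → 21.0
```

Black on white yields the maximum possible ratio of 21:1 — fully compliant, and exactly the kind of extreme the spec has nothing to say about.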

What that model can't capture: the experience of a person with ADHD whose attention gets hijacked by a pulsing notification badge while they're trying to complete a form. The experience of an autistic user who closes a browser tab because a product video auto-played ambient sound they weren't expecting. The experience of a dyslexic user trying to read justified text where uneven word spacing breaks the visual flow.

None of these experiences are in scope. Not because the people who built WCAG didn't care about them. Because they don't reduce to pass/fail metrics.

The Sensory Layer the Spec Doesn't Reach

Neurodivergent users encounter barriers that aren't in WCAG's criteria — and they're consistent, reproducible, and predictable once you know to look for them.

ADHD users describe autoplaying video, hover-triggered navigation menus, and notification systems that layer multiple simultaneous alerts as involuntary attentional captures. The interface redirects their focus before they've finished the task they were on, and refocusing after that disruption costs cognitive work that neurotypical users don't have to spend. WCAG Success Criterion 2.2.2 (Pause, Stop, Hide) addresses moving, blinking, and scrolling content, but it applies only to content that starts automatically and lasts more than five seconds, and it carves out exceptions where the movement is deemed essential. The hover menu that expands over your target and requires three steps to dismiss is technically legal.
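Some of these failures are mechanically detectable even though the experience behind them isn't: autoplaying media, at least, can be linted for before it ships. A rough sketch using only the Python standard library — the markup and the flagging policy are illustrative, not an existing tool:

```python
# Hypothetical lint pass: flag <video>/<audio> elements that autoplay.
# Muted autoplay still moves, so it gets flagged too.
from html.parser import HTMLParser

class AutoplayFinder(HTMLParser):
    def __init__(self) -> None:
        super().__init__()
        self.flagged: list[tuple[str, dict]] = []

    def handle_starttag(self, tag: str, attrs: list) -> None:
        attr_map = dict(attrs)  # bare boolean attrs parse as value None
        if tag in ("video", "audio") and "autoplay" in attr_map:
            self.flagged.append((tag, attr_map))

finder = AutoplayFinder()
finder.feed(
    '<video autoplay muted src="promo.mp4"></video>'
    '<audio src="ambient.mp3"></audio>'
)
print(len(finder.flagged))  # → 1 (the autoplaying video, not the idle audio)
```

A check like this catches the markup, not the five-second threshold or the "essential" exception — which is precisely the point: the parts of the criterion you can automate are the parts that were never the real barrier.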

Autistic users report that high-contrast color schemes — those designed to meet WCAG's minimum thresholds — can tip into visually overwhelming at the intensity end. WCAG sets a floor for contrast. It says nothing about ceiling. A site can score maximum compliance on visual contrast and still produce sensory difficulty for users with light sensitivity or visual processing differences.

Dyslexic users consistently benefit from specific typographic choices: open typefaces with distinct letter forms, generous line spacing, and left-aligned text. Justified text — typographically neutral, legally unproblematic under WCAG — creates the irregular word spacing that many dyslexic readers experience as visual interference. Nothing in the audit flags it.

Stéphanie Walter, whose work on neurodiversity and UX has become a practitioner reference in inclusive design, documents over 60 specific design decisions that affect neurodivergent users. The majority of them are invisible to automated accessibility tooling.

Fifteen to Twenty Percent Is Not an Edge Case

Estimates of neurodivergence in the general population consistently range from 15 to 20 percent, covering ADHD, autism spectrum conditions, dyslexia, dyscalculia, dyspraxia, and processing differences. This is not a minority concern for niche use cases. It's a user segment larger than mobile-only users at many enterprise SaaS companies.

The reason neurodivergent accessibility tends to get treated as optional has two roots. The first is historical: the disability rights frameworks that shaped the original W3C guidelines in the late 1990s centered on visible, categorical disabilities — visual impairment, motor impairment, deafness. Cognitive accessibility wasn't part of the initial frame.

The second is practical difficulty. Sensory and cognitive barriers don't show up in automated scans. You need actual users, real tasks, and observation. You need participants who can articulate — or at least demonstrate — how the interface creates friction. Most design teams don't recruit for this.

The W3C's Cognitive Accessibility Task Force, which produced the Cognitive Accessibility Guidance document, conducted a practitioner survey in 2024 and found that only 11 percent of design teams included neurodivergent users in any stage of usability testing. That number is not a statement about indifference. It's a statement about where accessibility practice has been pointing — toward audit tooling, not toward human participants.

What a Sensory Accessibility Audit Actually Looks Like

Testing for cognitive and sensory accessibility doesn't require specialized hardware or months of preparation. It requires recruiting neurodivergent users — ADHD, autistic, dyslexic participants — giving them defined tasks, and watching where the interface creates unexpected friction.

The findings are typically immediate. Patterns that appear consistently across neurodivergent testing include: navigation menus that expand on hover and block the user's original target, requiring dismissal before continuing; form validation that clears all fields on error, which disproportionately punishes users who are prone to distraction or input errors; animated loading states that loop indefinitely without progress indication, which create sustained attentional load; notification systems that surface three or four alerts simultaneously without priority hierarchy, creating a triage task the user didn't consent to.
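One pattern from that list — validation that discards everything on error — has a structural fix: return the user's values alongside the errors, so the re-rendered form keeps what they typed and flags only the fields that failed. A hypothetical sketch; the field names and rules are illustrative, not from any real codebase:

```python
# Sketch of form handling that preserves input on validation failure
# instead of clearing the form. Fields and rules are made up.

def validate(fields: dict[str, str]) -> dict[str, str]:
    """Return per-field error messages; an empty dict means valid."""
    errors: dict[str, str] = {}
    if not fields.get("name", "").strip():
        errors["name"] = "Name is required."
    if "@" not in fields.get("email", ""):
        errors["email"] = "Enter a valid email address."
    return errors

def submit(fields: dict[str, str]) -> dict:
    errors = validate(fields)
    if errors:
        # Re-render WITH the original values: only the failing fields
        # are flagged, and nothing the user typed is thrown away.
        return {"status": "error", "values": fields, "errors": errors}
    return {"status": "ok", "values": fields, "errors": {}}

result = submit({"name": "Ada", "email": "not-an-email"})
print(sorted(result["errors"]))     # → ['email']
print(result["values"]["name"])     # → Ada (survives the failed submit)
```

The cost asymmetry is the design argument: preserving values is a few lines for the team, while re-entering a cleared form is a full restart for a user whose attention the error already interrupted.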

None of these patterns are obscure. None fail WCAG. All of them show up with predictable frequency when neurodivergent users complete standard task flows.

The WCAG 3.0 working draft, still in development as of 2026, introduced a new scoring model and explicitly integrated guidance from the COGA Task Force. This is real progress. But WCAG 3.0 isn't finalized, and it won't be backward-compatible with existing WCAG 2.x compliance programs for years. The gap is real and currently unresolved at the standards level.

Compliance Is a Floor, Not a Destination

An earlier piece, "Glassmorphism Is the Best-Looking Design Trend That Fails Accessibility," described a design pattern where aesthetic choices create measurable contrast failure — a problem WCAG catches. The neurodivergent problem has the opposite structure: not a pattern that breaks a rule the spec contains, but a category of user need the spec wasn't written to address.

Treating WCAG compliance as an accessibility strategy is like treating fire code compliance as a disability access strategy. The codes overlap in places — both require accessible exits — but they were designed to answer different questions. A building can be fully fire-safe and genuinely inaccessible. A website can be fully WCAG 2.1 compliant and genuinely unusable for a significant portion of its intended audience.

The teams moving toward real accessibility are the ones who use WCAG as the floor and user testing with the actual population as the ceiling. They're the teams who, after the audit comes back clean, ask: "Who is this still hard for?"

That question is not in the spec. Neither is the answer. But it's the only question that tells you what the audit doesn't.


Photo by Beyzaa Yurtkuran via Pexels.