Study: AudioEye detects up to 2.5x more issues than other tools
EU Organisations Rely on Accessibility Tools to Meet the EAA. Independent Research Shows They Don’t All Find the Same Issues.
Independent research from Adience tested five leading accessibility tools under identical conditions and found wide variation in detection, with the widest gaps at WCAG Level AA, the standard underpinning EN 301 549 and the European Accessibility Act.
Author: Missy Jensen, Senior SEO Copywriter
Published: 04/09/2026
The European Accessibility Act (EAA) is no longer a future deadline; enforcement has been active since June 28, 2025. Under EN 301 549, the technical standard that underpins the EAA, digital products and services must meet the Web Content Accessibility Guidelines (WCAG) 2.1 Level AA. For most organisations covered by the directive, that means automated accessibility testing is now a frontline compliance tool.
Most organisations are doing the logical thing: adopting automated tools to scan their digital properties against WCAG to identify what needs to be fixed. But the assumption that these tools will surface roughly the same issues turns out to be wrong.
Independent research from Adience, a B2B market research firm, found that issue detection varies significantly across leading platforms, with some returning zero findings at the very conformance level the EAA enforces.
Automated Tools Look Similar. Their Results Aren’t.
Whether or not you’re currently using an accessibility testing tool, most digital accessibility software describes its capabilities in similar terms. Comprehensive WCAG coverage. Automated scanning. Issue detection across conformance levels.
On paper, these tools can seem interchangeable. In practice, what each tool actually detects varies significantly, even when scanning the same page.
The number of rules an accessibility tool checks, how it interprets WCAG success criteria, and which conformance levels it covers all affect the results of a scan. A tool that returns fewer issues isn’t necessarily telling you your site is more accessible. It might simply be running fewer checks or surfacing fewer issues.
To test this assumption under controlled conditions, independent research firm Adience evaluated how five different automated accessibility testing tools performed when scanning identical content.
What Adience Tested
Adience evaluated five leading automated accessibility platforms against six real-world websites: Best Buy, Starling Bank, Yale, U.S. Citizenship and Immigration Services, Cleveland Clinic, and Cognism. The sites were chosen to represent a wide range of industries, from large-scale eCommerce to government services to higher education.
Every tool scanned the same pages under identical, controlled conditions. The analysis focused exclusively on automated issue detection, including:
How many WCAG issues each tool surfaced.
Which WCAG success criteria each tool was capable of flagging.
How consistently each tool performed across all three WCAG conformance levels (A, AA, and AAA).
No manual testing, expert auditing, or human review was factored into the analysis. The test was a direct, apples-to-apples comparison of what each platform’s automation could find on its own.
That scope is key. Automated detection is the first layer most organisations rely on to identify compliance gaps at scale, especially under time-sensitive enforcement deadlines like the EAA’s. If that first line of defense misses issues, those issues remain invisible (and pose real risk) until a manual audit catches them, a user reports them, or an enforcement authority flags them.
Chris Wells, Managing Director at Adience, stated: “Adience applied the same rigorous testing approach to every tool, ensuring each was evaluated under identical, controlled conditions. What surprised us most was the significant scale of variation in the results.”
That variation showed up at every conformance level. But it was most consequential at Level AA.
What the Data Found
Level AA is the conformance level that matters most under EU accessibility laws. It’s the standard referenced by EN 301 549, and it’s where enforcement authorities will focus. But beyond compliance, Level AA criteria exist because they address the barriers that most directly affect real people trying to use digital products — insufficient color contrast, unlabeled buttons and form fields, inaccessible navigation, lack of compatibility with assistive technologies. These aren’t “nice-to-have” features. They’re the things that determine whether someone can complete a purchase, submit a form, or access a service.
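To make one of those AA criteria concrete: WCAG defines color contrast as a ratio between the relative luminance of two colors, with Level AA minimums of 4.5:1 for normal text (1.4.3) and 3:1 for large text and, under 1.4.11, for UI components and graphical objects. A minimal Python sketch of that calculation, following the formulas in the WCAG specification:

```python
def _luminance(hex_color):
    """Relative luminance of an sRGB color, per the WCAG 2.x definition."""
    hex_color = hex_color.lstrip("#")
    channels = []
    for i in (0, 2, 4):
        c = int(hex_color[i:i + 2], 16) / 255
        # Linearize each sRGB channel before weighting.
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def meets_aa(fg, bg, large_text=False):
    """Level AA text-contrast check: 4.5:1 for normal text, 3:1 for large
    text (3:1 is also the 1.4.11 threshold for non-text UI components)."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

For example, `#777777` grey on a white background comes in just under 4.5:1, so it fails AA for normal text while still passing the 3:1 large-text threshold. This is exactly the kind of borderline case where tools that run fewer contrast checks diverge.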
And it’s precisely at this level where the Adience research found tools diverging most sharply.
Here’s what the data showed at Level AA across the six websites tested:
One tool returned zero Level AA findings across all six sites, meaning its automation detected no AA-level issues on any of the pages scanned.
One tool returned zero AA findings on three of the six sites.
Two tools returned zero AA findings on two of the six sites.
Only one tool returned Level AA findings on all six sites.
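The "zero findings" pattern above is straightforward to surface once scan results are tabulated per tool and per conformance level. The sketch below is purely illustrative (the tool names and counts are hypothetical, not Adience's data) and shows how a zero Level AA total flags a detection gap rather than a clean site:

```python
from collections import defaultdict

# Hypothetical scan output: each record is one tool's issue count for one
# page at one WCAG conformance level. Not Adience's actual data.
findings = [
    {"tool": "Tool A", "level": "A",  "page": "home", "count": 9},
    {"tool": "Tool A", "level": "AA", "page": "home", "count": 14},
    {"tool": "Tool B", "level": "A",  "page": "home", "count": 6},
    {"tool": "Tool B", "level": "AA", "page": "home", "count": 0},
]

def totals_by_level(records):
    """Sum detected issues per (tool, conformance level) across all pages."""
    totals = defaultdict(int)
    for rec in records:
        totals[(rec["tool"], rec["level"])] += rec["count"]
    return dict(totals)

def zero_aa_tools(records):
    """Tools whose scans surfaced no Level AA issues at all -- the pattern
    worth investigating as a detection limit, not a clean bill of health."""
    tools = {rec["tool"] for rec in records}
    totals = totals_by_level(records)
    return sorted(t for t in tools if totals.get((t, "AA"), 0) == 0)
```

With the hypothetical records above, `zero_aa_tools(findings)` flags `"Tool B"`: its scan ran, but its automation returned nothing at the level the EAA enforces.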
Across the study, detection gaps between the highest- and lowest-performing tools were substantial, ranging from 89% to 253% more WCAG issues identified. Only one tool detected valid issues at every conformance level across every website tested.
Why This Matters for Organisations Under the EAA
A tool that surfaces fewer issues can feel reassuring. A short report looks like progress, maybe even a fully ‘compliant’ site. But the Adience research makes clear that lower issue counts often reflect detection limits, not the absence of barriers.
For organisations building digital products and services in the EU, the risk is twofold.
First, it’s a user experience problem. Level AA criteria cover interactions that matter most to real people: whether a visually impaired user can distinguish a button from its background, whether a form works with assistive technology, whether someone using voice input can activate the controls they see on screen.
When a tool misses these issues, they don’t get fixed. The barriers stay in the product, and the people who encounter them either struggle through or leave. That’s not a compliance abstraction — it’s a measurable loss of usability, trust, and reach.
Second, it’s a regulatory exposure problem. The EAA doesn’t distinguish between “we didn’t know about the issue” and “we chose not to fix it.” Enforcement authorities across EU member states are actively monitoring compliance, and several countries, including France, Sweden, and Denmark, began outreach and enforcement actions within months of the June 2025 deadline. Put simply, a tool that returns no AA-level findings on pages where real issues exist doesn’t protect you from scrutiny.
The takeaway is straightforward: the tool you use to assess accessibility risk directly shapes what risks you can see. And what you can’t see, you can’t fix — for your users or for your compliance posture. This applies beyond the EU’s borders, too. The EAA does apply to businesses outside the EU that serve EU customers.
None of this means automated testing is the wrong approach: it’s an essential one. But the Adience data makes a compelling case that the question isn’t whether to use automated testing; rather, it's whether the tool you’re relying on shows you the full picture.
One tool in the study did. Consistently, across every site and every WCAG level tested.
Where AudioEye Created the Widest Separation
That tool? AudioEye. And its separation from the field wasn’t marginal.
Four findings from the research illustrate where AudioEye’s automated detection pulled furthest ahead:
509% more Level A issues than the lowest-performing tool. At the foundational level of WCAG, AudioEye identified more than six times as many valid issues as the weakest performer in the study.
10 unique WCAG success criteria detected, compared to 7-8 for all other tools. The breadth of what a tool can detect matters as much as the volume. AudioEye’s rule engine covered more individual success criteria than any other tool tested.
Only AudioEye detected WCAG 1.4.11 (Non-text Contrast) and WCAG 2.5.3 (Label in Name). Both are WCAG 2.1 criteria — 1.4.11 at Level AA and 2.5.3 at Level A — and both are directly relevant to EAA compliance. No other tool in the study flagged either criterion.
The only tool to detect issues at all three WCAG levels across all six sites. Consistency matters. A tool that performs well on some pages but returns nothing on others creates an incomplete and potentially misleading picture of your accessibility posture.
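To illustrate what one of those less commonly flagged criteria actually checks: WCAG 2.5.3 (Label in Name) requires a control’s accessible name to contain its visible label text, so that a speech-input user can activate a button by saying what they see on screen. A minimal, hypothetical sketch of that comparison (real checkers must also extract labels and accessible names from the DOM, which this omits):

```python
import re

def _normalize(text):
    """Lowercase, drop punctuation, and collapse whitespace so the
    comparison tolerates cosmetic differences in the two strings."""
    text = re.sub(r"[^a-z0-9\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def label_in_name(visible_label, accessible_name):
    """WCAG 2.5.3 check: the accessible name (e.g. aria-label) must
    contain the visible label text, or speech-input commands like
    'click Send' will not match the control."""
    return _normalize(visible_label) in _normalize(accessible_name)
```

For example, a button showing "Search" with the accessible name "Search this site" passes, while a button showing "Send" whose `aria-label` is "Submit form" fails, because saying "Send" would not activate it.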
With accessibility compliance no longer theoretical, the tool you choose to ensure your digital content is accessible isn’t a back-office decision. It determines what gets fixed, what gets missed, and ultimately, whether the people you’re building for can actually use what you’ve built.
The Adience report shows that gap is wider than most organisations realize. AudioEye is how you close it. Start with a free scan to see what your current tool might be missing. Or schedule a demo to see the full platform.
Get the Full Adience Report
The complete research from Adience includes side-by-side detection results across all five tools, WCAG coverage breakdowns at Level A, AA, and AAA, valid issue comparisons across all six websites, and the full methodology.
This study was commissioned by AudioEye and conducted independently by Adience. Adience does not endorse any specific vendor. Findings reflect results under the stated test conditions.