The AudioEye Accessibility Score: How is it calculated and what does it mean for your digital accessibility?

Accessibility Score before AudioEye at 67/100 and after activating AudioEye at 92/100

Summary: AudioEye's Live Monitoring scans website elements each time a user loads a page, testing them against accessibility coding frameworks. On average, the websites we monitor are scanned almost 1,000 times per day, and we roll the results into a single Accessibility Score to track progress. This article covers what that score represents and how it's calculated.

AudioEye's Live Monitoring scans HTML elements and their attributes each time a user loads a web page, testing them against the A and AA success criteria in the Web Content Accessibility Guidelines (WCAG) and other accessible coding frameworks.

On average, the websites we monitor undergo almost 1,000 scans daily, which we roll up into a single convenient Accessibility Score, delivered to each client’s Admin Portal. Let’s learn what this score represents and how it’s calculated.

How AudioEye Calculates the Accessibility Score

Web pages are built from elements written in HTML, each with attributes that define its behavior or appearance. These elements determine the degree to which each web page is accessible to people living with a disability.

For example, failing to ensure sufficient color contrast between foreground text and the page background could make it difficult for a visually impaired user to read the text. In addition, the choice of font for that same piece of text could determine its readability for a person with dyslexia, which means that any single element may need to be tested against multiple WCAG Success Criteria.
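WCAG defines color contrast precisely: Success Criterion 1.4.3 requires a ratio of at least 4.5:1 for normal-size text at Level AA. As an illustration of the kind of check an automated scanner can run (this is a minimal sketch, not AudioEye's implementation), here is the WCAG contrast-ratio formula in Python:

```python
def _linearize(channel_8bit):
    """Convert an 8-bit sRGB channel to linear light, per the WCAG definition."""
    c = channel_8bit / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) color, each channel 0-255."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background passes AA easily (21:1);
# light gray on white falls well short of 4.5:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))       # 21.0
print(contrast_ratio((200, 200, 200), (255, 255, 255)) >= 4.5)    # False
```

Checks like this one are fully machine-decidable, which is why contrast failures can be reported as Errors rather than flagged for manual review.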

(A visual breakdown of an HTML Element on a website showing two elements Head and Body, and all the sub-elements that make up those, including Links, Navigations, Images, Text, Forms, and other website Elements.)

A single test can produce multiple results, each falling into one of four categories:

  1. Pass—the test detected no failures.
  2. Risk—the test detected a pattern that must be interpreted before action can be taken.
  3. Error—the test detected a failure to fulfill one or more success criteria.
  4. Review—the test was unable to programmatically determine whether the element fulfills its success criteria. This outcome means that manual testing is required.

We assign all Error and Risk results an Impact score, which is a numeric assessment of their severity. For example, when a test for image text alternatives is categorized as an Error, this will produce a high Impact score—because this class of error can represent a significant barrier for users attempting to complete their goal on your site or page.
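The image text alternative test mentioned above is one of the more straightforward checks to automate. As a simplified sketch (not AudioEye's actual test), this Python snippet uses the standard library's HTML parser to flag `<img>` elements that lack an `alt` attribute:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags with no alt attribute (relates to WCAG 1.1.1)."""

    def __init__(self):
        super().__init__()
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            # Note: alt="" is valid for purely decorative images, so a real
            # checker would treat an empty alt as a Risk needing human
            # interpretation rather than an outright Error.
            if "alt" not in attributes:
                self.errors.append(attributes.get("src", "<no src>"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.errors)  # ['chart.png']
```

The second image is reported because a screen reader would have nothing to announce for it, which is exactly the kind of barrier that earns a high Impact score.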

By contrast, when a test relating to coding technique or universal design is categorized as a Risk, it may receive a lower Impact score because the user's OS, assistive technology, or standard browser may be able to mitigate the fault through features such as magnification or a simplified view. A higher Impact score naturally lowers the overall Accessibility Score.
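AudioEye does not publish its exact roll-up formula, but the relationship between Impact and the overall score can be sketched with a hypothetical weighted calculation. Every name and weight below is illustrative only:

```python
def accessibility_score(impacts, max_impact_per_test=10):
    """Hypothetical roll-up (AudioEye's real formula is not published).

    Each test result carries an Impact weight; a Pass contributes 0.
    The score is 100 minus the fraction of the worst-case total Impact
    that was actually observed, so higher Impact pulls the score down.
    """
    worst_case = len(impacts) * max_impact_per_test
    observed = sum(impacts)
    return round(100 * (1 - observed / worst_case), 1)

# Ten tests: seven clean Passes (impact 0), two low-impact Risks,
# and one high-impact Error.
print(accessibility_score([0, 0, 0, 0, 0, 0, 0, 2, 3, 9]))  # 86.0
```

The key behavior the sketch captures is that a single high-impact Error moves the score far more than several low-impact Risks.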

(A visual breakdown of an Element on a website and the three different tests that AudioEye runs to look for Risks and Errors. Each Result is given its own impact score from A to F.)

The Accessibility Score is on a scale of 0 to 100, with all tested elements feeding into the total score. A score of 0 represents the worst possible result on every test, while a score of 100 represents the best possible result on every test. 

AudioEye ranks scores below 50 as “Poor” and color-codes them red in the Admin Portal. Websites in this category will typically feature a large number of potentially serious obstacles to accessibility.  

“Okay” scores fall between 50 and 74, and are color-coded yellow. Websites in this category will typically feature some medium-level obstacles to accessibility.

Scores of 75 or more are ranked as “Good” and color-coded green; websites in this category will typically present no significant obstacles to users with disabilities.
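The three bands above map directly onto a simple threshold function. This sketch mirrors the published cut-offs, assuming the score is a number on the 0-100 scale:

```python
def score_band(score):
    """Map a 0-100 Accessibility Score to its band and Admin Portal color."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < 50:
        return ("Poor", "red")
    if score < 75:
        return ("Okay", "yellow")
    return ("Good", "green")

# The before/after scores from the example at the top of this article:
print(score_band(67))  # ('Okay', 'yellow')
print(score_band(92))  # ('Good', 'green')
```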

The Accessibility Score is designed to provide an at-a-glance overview of your website’s accessibility, and to act as a benchmark for ongoing improvements. While WCAG is the globally accepted standard for digital accessibility, many of its guidelines are subject to human interpretation and judgment.

Even if you could achieve full conformance to WCAG, this would not necessarily guarantee compliance in any given legal jurisdiction. The AudioEye Accessibility Score should therefore be seen as just one element within a holistic approach to measuring and maintaining accessibility.

Update: Toolbar Improvements 

AudioEye is always working on improving the efficacy and efficiency of our test suite by adding dimensions and tests as new success criteria are recommended. When we make these updates, site scores will naturally tend to change.  

Recent enhancements to the AudioEye toolbar functionality will mitigate some existing WCAG issues across 18 tests, producing Pass results for elements that would otherwise have generated Risk or Error results. Taking the toolbar’s functionality into account in this way allows us to show you a more complete picture of the real-life accessibility of your site or page.  

After we deploy this change and complete post-deployment site monitoring, all users will see a baseline increase for their sites in the Admin Portal.


Learn what your site’s Accessibility Score is by starting your free 30-day trial today
