100 Days at AudioEye: 60-Day Check-In

Manual & Automated Accessibility Testing, a Match Made in Heaven

The implementation process for many AudioEye clients is a 100 Day Journey. Many milestones are reached along this journey, and an important one happens around day 60. This is when we have completed a majority of accessibility remediations and the site graduates to a second phase of accessibility certification. For a project manager, the 60-day mark is a significant date, one that is emphasized and planned for.

My journey with AudioEye has surpassed the 60-day milestone. The date came and went with little acknowledgment, and I find this to be a wonderful thing. I’m producing, learning, and finding each day to be more eventful than the last, so much so that I’ve lost track of time. Compared to the long journey this company has already undergone, 60 days is just the beginning, but the amount of learning and growth that has happened in those 60 days is immense, both for me personally and for the company.

For this post, I was lucky enough to steal some time from our VP of Technology, Jeff Jones, to discuss AudioEye’s updated Digital Accessibility Platform (DAP), an accessibility report from the UK government, and the gap between manual and automated accessibility testing. Jeff was patient with my questions and answered each one with terminology and information that, I’m happy to report, I understood. Self-five. Here’s a synopsis of what I learned:

Recently, AudioEye launched an update to our Digital Accessibility Platform (DAP). DAP has been a fundamental tool for our company, and this update allows it to test every element against every WCAG success criterion. The updated system then pre-qualifies potential WCAG errors for a manual tester, allowing the tester to review the flagged elements and assess the issues.

That's what we're trying to do: reduce the burden on the human tester by using automation to support them as much as possible; to pre-qualify the leads and get their efforts to be more focused on solving problems rather than identifying problems.

Jeff Jones
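AudioEye’s DAP is proprietary, so as a rough illustration of the hand-off Jeff describes, here is a minimal sketch (not AudioEye’s actual implementation) built on the open-source axe-core engine via @axe-core/puppeteer. That engine already splits its findings into confident “violations” and “incomplete” checks that need a human decision, which maps loosely onto the idea of pre-qualifying leads for a manual tester. The preQualify function and the report shape are my own inventions for this example.

```typescript
// Sketch only: illustrates the automated-to-manual hand-off using the
// open-source axe-core engine; AudioEye's DAP is proprietary and not shown here.
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';

async function preQualify(url: string) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Run the automated checks against the rendered page.
  const results = await new AxePuppeteer(page).analyze();

  // "violations" are failures the engine is confident about: known issues.
  const knownIssues = results.violations.map(v => ({
    rule: v.id,
    impact: v.impact,
    wcagTags: v.tags.filter(t => t.startsWith('wcag')),
    elements: v.nodes.map(n => n.target),
  }));

  // "incomplete" results are the pre-qualified leads: the engine found
  // something worth checking, but a human has to make the final call.
  const needsManualReview = results.incomplete.map(v => ({
    rule: v.id,
    summary: v.description,
    elements: v.nodes.map(n => n.target),
  }));

  await browser.close();
  return { knownIssues, needsManualReview };
}
```

The point of the sketch is the split itself: the automation commits to what it can prove, and everything else reaches the manual tester already narrowed down to specific rules and elements.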

Around this time, the UK Government created the most inaccessible webpage they could develop, including 143 Web Content Accessibility Guidelines (WCAG) failures, or as they refer to it, the world’s least-accessible webpage. The page was tested with ten free automated testing tools commonly used by developers and accessibility auditors, and their published findings detail how each tool performed. Of the ten tools tested, the top performer correctly identified 38% of the intended accessibility barriers. The report presents a lot of numerical analysis, but what I found most impactful was the conclusion (something that’s discussed often in the accessibility world):

Our opinion of automated testing tools is the same after the audit as it was before. We think they are very useful and should definitely be used by teams to pick up issues. But also that they cannot be relied on, by themselves, to check the accessibility of a website. They are most effective when combined with manual testing.

Mehmet Duran

The UK Government’s audit of automated accessibility tools provides a tangible test and quantitative data about the tools competing in the accessibility space. We were eager to see how AudioEye’s system would match up. Testing the world’s least-accessible webpage? Count us in.

We set up our tool to test the UK Government’s crafted webpage and looked at DAP’s findings both with and without the update. Our legacy tool correctly identified 40% of the accessibility failures presented. That number makes me want to high-five our team of developers: AudioEye has been performing consistently alongside the top dogs of web accessibility.

Then we assessed our new interface, the one that tests against every WCAG success criterion. It correctly identified another 12-13% of issues, bringing our total issues reported to over 50%. That number makes me want to high-ten our team of developers. Jeff, our VP of Technology, mentioned that these numbers surpassed his expectations: “We’ve stumbled on something pretty powerful, and are getting better over time.”

The percentage bump shows that AudioEye is successfully beginning to bridge the gap between manual and automated testing. Our update allowed us to correctly identify over half of the intended errors, and that’s not counting the additional errors we were able to pre-qualify for the manual tester. DAP classifies elements at different levels of error, ranging from known issues to elements flagged for review. In every case, we evaluate each element against the WCAG success criteria so we have a comprehensive list of all the possible outcomes. Guided testing for the manual tester ensures we cover every WCAG success criterion and begins to surface patterns for additional automation.
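To make the “levels of errors” idea concrete, here is a small, hypothetical sketch of a coverage checklist keyed by WCAG success criterion. The statuses and types are illustrative assumptions, not AudioEye’s actual data model; the intent is only to show how known issues, items flagged for manual review, and untested criteria can live in one comprehensive list that guides the tester.

```typescript
// Sketch only: a hypothetical coverage checklist keyed by WCAG success
// criterion. The types and statuses below are illustrative, not AudioEye's
// actual data model.

type FindingStatus = 'known-issue' | 'needs-review' | 'passed' | 'not-yet-tested';

interface CriterionCoverage {
  criterion: string;              // e.g. "1.1.1 Non-text Content"
  status: FindingStatus;
  source: 'automated' | 'manual';
  notes?: string;
}

// Start with every success criterion marked untested, so nothing is skipped
// and the manual tester can see the remaining work at a glance.
function buildChecklist(allCriteria: string[]): Map<string, CriterionCoverage> {
  return new Map(
    allCriteria.map((c): [string, CriterionCoverage] => [
      c,
      { criterion: c, status: 'not-yet-tested', source: 'automated' },
    ]),
  );
}

// Fold an automated finding into the checklist. Anything the engine could not
// decide on its own stays flagged for the manual tester's guided review.
function recordAutomatedFinding(
  checklist: Map<string, CriterionCoverage>,
  criterion: string,
  confident: boolean,
  notes?: string,
): void {
  const entry = checklist.get(criterion);
  if (!entry) return;
  entry.status = confident ? 'known-issue' : 'needs-review';
  entry.source = 'automated';
  entry.notes = notes;
}
```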

There are specific criteria that will always require a manual check. The best results come from a combination of manual and automated testing, so we harness both in a way that drives automation improvement: pre-qualifying issues, identifying patterns, and archiving this information within our Digital Accessibility Platform. This structure not only keeps the manual tester efficient and effective, but also enhances DAP’s ability to automate accessibility fixes, making the web more accessible and more usable.

As I enter the final weeks of my initial 100 Day Journey with AudioEye, I’m pleased to report that I, the AudioEye team, and our technology are continually striving toward new phases in our digital accessibility journey. We’ve achieved many milestones, and we have plans for many more!

Works Cited

Duran, Mehmet. "What We Found When We Tested Tools on the World’s Least-accessible Webpage." Accessibility Blog. GOV.UK, 24 Feb. 2017. Web. 9 Mar. 2017.

Maxie Adler

Project Manager at AudioEye, Inc.

Tucson, AZ
