Why Manual Testing Equals Smarter Automation

Posted December 05, 2022

AudioEye

A stylized webpage that shows a number of accessibility issues, next to an icon of gears.

Automation is a critical part of solving digital accessibility at scale, but it works best when backed by human experts.

When it comes to digital accessibility, there are a few different approaches.

Some organizations value the speed and convenience of simple, automation-only tools, which promise to deliver accessible websites without the need for people.

Others prefer manual audits, which often involve certified accessibility experts going through each line of code on a website. However, these audits take a long time, cost a lot of money, and don’t actually fix anything. Instead, they leave organizations with a long list of issues to resolve by hand.

Worse, manual audits only measure a website’s accessibility at a point in time. Any new accessibility issues won’t be detected until the next audit, which can leave organizations exposed.

At AudioEye, we think the answer lies somewhere in the middle.

There’s an obvious need for the speed and scalability of automation, which can help organizations keep pace with the size and ever-changing nature of the internet. But it’s also important to recognize that automation alone cannot ensure digital accessibility. The technology continues to improve, but the role of people in solving issues today and shaping smarter automation should not be discounted.

In this post, we explain why automation is necessary to help solve digital accessibility at scale — and make a case for why it works best when backed by human experts.

Organizations Everywhere Are Leaning Into Automation

In a Deloitte survey on AI and automation, nearly 50% of respondents said their organization was “deeply involved” in automation, with 24% using AI to help them perform “routine tasks.”

There are plenty of “routine tasks” in digital accessibility, like making sure that every field has a label or confirming that HTML heading elements are in the correct order.

Left unchecked, these are the kind of accessibility issues that can negatively impact the user experience for people with disabilities. But they’re also relatively easy to find and then fix with code, which is why automated testing and remediation is such an effective first line of defense.
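Checks like these are mechanical enough to script. As a rough sketch of the idea — not AudioEye's actual engine — a few lines of Python using the standard library's `html.parser` can flag unlabeled form fields and skipped heading levels:

```python
from html.parser import HTMLParser

class RoutineChecks(HTMLParser):
    """Flags two 'routine' accessibility issues: form fields without an
    associated <label for="..."> and skipped heading levels."""

    def __init__(self):
        super().__init__()
        self.labeled_ids = set()  # ids referenced by <label for="...">
        self.field_ids = []       # id (or None) of each form field seen
        self.headings = []        # heading levels in document order

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label" and "for" in attrs:
            self.labeled_ids.add(attrs["for"])
        elif tag in ("input", "select", "textarea"):
            self.field_ids.append(attrs.get("id"))
        elif len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.headings.append(int(tag[1]))

    def issues(self):
        found = []
        for field_id in self.field_ids:
            if field_id is None or field_id not in self.labeled_ids:
                found.append(f"form field without label: id={field_id}")
        for prev, cur in zip(self.headings, self.headings[1:]):
            if cur > prev + 1:  # e.g. an h1 followed directly by an h3
                found.append(f"heading level skipped: h{prev} -> h{cur}")
        return found

if __name__ == "__main__":
    checker = RoutineChecks()
    checker.feed('<h1>Title</h1><h3>Skipped</h3><input id="email">')
    for issue in checker.issues():
        print(issue)
```

A real-world checker would handle far more (ARIA attributes, implicit labels that wrap their inputs, dynamically rendered content), but the structure is the same: parse the page, apply a rule, report what fails.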

At AudioEye, our automated solution can detect about 70% of common accessibility issues — and automatically resolve about two-thirds of them. All of this is done before the page loads to help you deliver a user experience that’s seamless for every visitor.

Our patented technology delivers more than a billion remediations daily. Unlike automation-only solutions, however, we recognize that there’s still a large number of issues that technology alone cannot solve. That’s why we rely on manual testing to help us deliver the highest level of accessibility.

A stylized web page, with an icon of a person on the left side and a magnifying glass with an accessibility symbol on the right side.

How Does Manual Testing Improve Automation?

Whenever we manually test a website, we use the insights from that audit to help us develop new automated fixes and solve more issues proactively.

Here’s how it works:

1. Initial Scan

After the initial automated scan, a developer remediates any issues found in the scan that could not be automatically fixed.

2. Manual Test

Using a screen reader and keyboard navigation, an accessibility tester goes through a testing workflow to identify additional accessibility issues.

3. Remediation

A developer fixes any additional issues found by the accessibility tester, then returns them to the tester for re-testing. Steps two and three repeat with each remediation round.

4. Last Pass

Before changes are published, a developer checks the site to confirm everything is working as expected.

Because our technology runs natively alongside the page, we can code custom remediations into our automated solution, applying fixes to a website without affecting its source code.
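In practice, runtime fixes like these are typically applied by script injected into the page as it renders, but the rule-based idea behind them can be sketched in a few lines. The rules below are hypothetical examples, not AudioEye's actual remediations:

```python
import re

# Each rule pairs a detector (a pattern over the served markup) with a fix
# applied at render time -- the underlying source files are never modified.
REMEDIATION_RULES = [
    # Mark images with an intentionally empty alt as decorative.
    (re.compile(r'<img([^>]*?)alt=""([^>]*?)>'),
     r'<img\1alt="" role="presentation"\2>'),
    # Give an icon-only button (hypothetical markup) an accessible name.
    (re.compile(r'<button class="icon-close"></button>'),
     '<button class="icon-close" aria-label="Close"></button>'),
]

def remediate(markup: str) -> str:
    """Apply every remediation rule to a copy of the served markup."""
    for pattern, replacement in REMEDIATION_RULES:
        markup = pattern.sub(replacement, markup)
    return markup
```

The point of the pattern is reuse: once a fix is expressed as a rule rather than a one-off edit, it can run on every page load — and, as described below, be promoted into the default rule set.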

As an added bonus, this process often reveals patterns that can be turned into automated solutions that are deployed across all customer websites.

That last part — the ability to turn custom fixes into default automations — is why manual testing is so important.

It’s not just about spot fixes, or applying human judgment to something subjective like alt text; it’s about using what human testers find to inform smarter automation over time.

A purple accessibility symbol, with an icon of a person on the left and a gear on the right.

Automation Is Here To Stay, But People Aren’t Going Anywhere

Automation is getting better at doing human jobs, but it’s not ready to stand on its own. People still need to be involved, whether it’s someone determining if a line of alt text is descriptive enough (and not merely confirming the presence of alt text, like an automated solution would do) or watching a video and making sure that the audio description matches what’s happening on screen.
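The alt text example shows the division of labor neatly: automation can verify that alt text exists, and at best apply a heuristic for text that is probably not descriptive; judging whether it actually describes the image remains a human call. A sketch of what each side can check (the heuristic here is illustrative, not a real product rule):

```python
import re

def alt_is_present(alt):
    """What automation can verify: alt text exists and is non-empty."""
    return bool(alt and alt.strip())

PLACEHOLDER_WORDS = {"image", "photo", "picture", "graphic", "icon"}

def alt_looks_suspicious(alt):
    """A rough heuristic for alt text that is probably not descriptive:
    a filename or a generic placeholder word. A human makes the final call
    on whether the text actually describes the image."""
    if re.search(r'\.(jpe?g|png|gif|webp|svg)$', alt.strip(), re.IGNORECASE):
        return True
    return alt.strip().lower() in PLACEHOLDER_WORDS
```

An automated scan would pass `alt="image123.jpg"` as present; the heuristic flags it as suspect; only a person can confirm that "A golden retriever catching a frisbee" is the right description.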

Want to learn more about AudioEye’s hybrid approach to digital accessibility? Check out our white paper on Building for Digital Accessibility at Scale. Or, get a free scan of any URL to see how accessible your website is today.
