Lessons Learned From AudioEye’s Community Testing Program
Get an inside look at AudioEye’s community testing program — who we partner with, how they help improve our QA process, and key takeaways from a year of working together.
In 2021, AudioEye launched a community testing program to help improve our QA process, inform our product roadmap, and create opportunities for people with disabilities.
The AudioEye A11iance Team is a group of people with disabilities who use assistive technology (AT) in their daily lives. Members work with our QA team to test AudioEye customer sites with custom accessibility plans and to test manual remediations in development. Their feedback and insights are invaluable in our work, giving us an end-user perspective of websites using AudioEye’s services.
Getting To Know the AudioEye A11iance Community
We started our recruitment process with screen reader use as the primary requirement. And because many of our new testers join by referral, it’s no surprise that the A11iance Team is primarily made up of people who are blind or low vision.
We also have testers who are hard of hearing, or who have a disability that may affect mobility, such as cerebral palsy or amputation.
Our members work in non-technical and technical fields — from customer service and patient advocacy to network security and AT training. We also have testers who are unemployed or underemployed.
Our community enjoys a wide range of interests and hobbies. We have readers and writers, musicians and singers, rock climbers, adaptive water skiers, tabletop gamers and coders.
What Types of Assistive Technology Does Our Community Use?
Our community members use a variety of assistive technologies to perceive, operate, and understand web content. Screen readers — with and without braille display — are most common, but we also have members who use Kurzweil to convert text to speech, live captions for those who are hard of hearing, and high contrast mode or magnification for those who are low vision.
Most Used Screen Readers
Because the majority of our community members are blind, we wanted to share screen reader usage:
- JAWS (42% of our community)
- JAWS and NVDA (37%)
- JAWS, NVDA, and VoiceOver (8%)
- VoiceOver (8%)
- Kurzweil (3%)
- NVDA (2%)
Key Learnings From Our Testing Community
Here’s a look at some key learnings from working with the A11iance team this past year:
1. We Have To Be Careful Not To Penalize People for Working.
No matter how enthusiastic someone is about working with us, we have to be mindful of certain financial implications for them. People with disabilities sometimes have limitations on how many hours they can work — or how much income they can earn — based on state and federal government assistance guidelines. We’ve learned to be extra considerate of this and work with each individual according to their availability.
2. Equity of Experiences Is Being Determined by People Without Disabilities.
We know that most of the internet is built by people without disabilities and without accessibility as a key consideration. Website managers, designers, writers, and developers may not understand how to make all interactions accessible. Or, they may hide elements from users or do the bare minimum to support accessibility. For example, an image description, or alt text, that simply says “sweater” isn’t good enough for someone considering a purchase who wants to know the color, the texture, or whether it has a pattern.
We frequently encounter important information being hidden from screen readers, and missing or minimal descriptions for graphics, logos, and key photography. When this happens, people who are blind or low vision are left to wonder what they’re missing.
3. Lack of Support or Learned Behavior Influences How and When Issues Are Reported.
We frequently run into testers who have learned to make the best of inaccessible websites. Many of our community members have had negative experiences with customer service or IT teams who were unable — or unwilling — to resolve accessibility issues.
Poor user experiences all over the internet have also conditioned some of our community members to ignore or work around accessibility issues and not report them. For that reason, AudioEye requires user testing training, which helps ensure all community members use the same set of standards.
An Inside Look at a Moderated Testing Session
We give community members a chance to participate in moderated testing sessions of client websites. These sessions are either task-based (following defined paths through the website) or ad hoc (letting testers find every bug they can on the homepage of a site). Bugs are logged for remediation, and video clips of both good and bad experiences are made available to clients and AudioEye’s internal teams.
Here’s a brief transcript of a moderated testing session with one of our testers, who was asked to provide feedback on a cookie banner experience:
“The way I got to that, by the way, is I jumped to the very bottom of the page. Most of the time that works. If I know I’m navigating behind a pop-up, I go all the way to the bottom [by hitting] CTRL + END.”
“I’m glad to see that worked. It’s not ideal, but it’s one of those workarounds you learn to live with.”
“Tell me what the ideal scenario would be?”
“The ideal scenario would be that I cannot get out of that pop-up until I dismiss it.”
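The “ideal scenario” the tester describes is what developers call a keyboard focus trap: while a modal dialog or cookie banner is open, Tab and Shift + Tab should cycle only through the dialog’s own controls until the user dismisses it. As a minimal sketch (hypothetical helper, not AudioEye’s implementation), the wrapping logic behind such a trap can be reduced to a pure function over the dialog’s list of focusable elements:

```typescript
// Given the index of the currently focused element inside a dialog,
// the number of focusable elements in the dialog, and the Tab direction,
// return the next focus index — wrapping at both edges so keyboard
// focus never escapes into the page behind the dialog.
function nextFocusIndex(current: number, count: number, shiftKey: boolean): number {
  if (count === 0) return -1; // nothing focusable; caller should focus the dialog itself
  const step = shiftKey ? -1 : 1;
  return (current + step + count) % count; // modular wrap in both directions
}

// Tabbing forward from the last of three controls wraps to the first…
console.log(nextFocusIndex(2, 3, false)); // → 0
// …and Shift + Tab from the first control wraps back to the last.
console.log(nextFocusIndex(0, 3, true)); // → 2
```

In a real page, a keydown listener on the dialog would intercept Tab, call a function like this, and move focus to the computed element, so the CTRL + END workaround the tester describes is never needed.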
What’s Next for the A11iance Team?
We’re continuing to grow our community and create more opportunities for people to contribute in moderated and unmoderated sessions. Our goal is to make sure there’s a program for everyone, however they want to or are able to participate.
Interested in joining the A11iance team? Click here to get more information.
Curious about your website’s accessibility? Click here to get first-hand feedback and insights from our community of expert testers.