PODCAST: Digital Accessibility Principles with AudioEye CTO, Sean Bradley, on Device Squad Podcast

Co-Founder & CTO at AudioEye, Sean Bradley, was interviewed on the Device Squad podcast about how AudioEye enhances the user experience to help make digital content more accessible for those with disabilities.

Device Squad, the podcast for the mobile enterprise, is a production of Propelics, an Anexinet company.

You can also find the podcast on the Device Squad website, or subscribe on iTunes or Android.

[DOOR BELL]

Sean Bradley:            This is Sean.

Steven Brykman:      Hey, Sean. How you doing?

Sean Bradley:            I’m doing great. How about yourself, Steven?

Steven Brykman:      I’m doing good.

Sean Bradley:            Where you calling in from today?

Steven Brykman:      So, I’m just outside of Boston.

Sean Bradley:            OK.

Steven Brykman:      Where are you at?

Sean Bradley:            We’re in Tucson, Arizona.

Steven Brykman:      So welcome to Device Squad. Thanks so much for doing this with me. I just love AudioEye’s whole mission to make the web more accessible. And well, we’ll get into more of how you’re doing that. So, folks, thanks for hopping on and tuning into another episode of Device Squad, the podcast for the Mobile Enterprise from Propelics.

Device Squad, fighting crimes against enterprise mobility, worldwide. Bad UI, frustrating user experiences, poorly conceived mobile strategies, we defeat them all. This podcast series will cover all aspects of enterprise mobility, including strategy, development, design, testing, and more. My name is Steve Brykman. I’m a mobile strategist and UX architect with Propelics.

First, a brief history of the company. Founded in 2011, Propelics is a mobile strategy consultancy helping enterprises worldwide devise true mobile strategies and develop world-class mobile applications across all industry verticals. Propelics’ clients include large companies like Amway, Bright Horizons, Bank of Montreal, Dubai Airports, Family Dollar, DLL, Cintas, Merck, and many more.

Propelics’ menu of services includes eight different mobile kick-starts, covering everything from mobile strategy and road mapping, to MCOE, to UI/UX design, to mobile testing strategy, along with soup-to-nuts app design, development, and support. Additionally, Propelics offers three homegrown Enterprise Mobile products.

My guest today is Sean Bradley. And Sean is the founder and CTO of AudioEye, which is an accessibility company. So we’re going to be talking today about how AudioEye is able to transform the experience of digital content for users with special needs and certain disabilities, to open it up for the broadest audience possible.

Sean, if you could just talk about your own personal digital journey, and how you came to found AudioEye? And what it was that made you so interested in this subject?

Sean Bradley:            Yeah. You know, so at the time, in the early 2000s time frame, right? My brother and I were starting up some digital companies specifically centered around, you know, streaming media. And really, when that was in its infancy, we were looking to provide businesses with a resource for posting video content online. And then, we kind of started dabbling in mobile marketing and other different types of start-ups.

And around that same time frame, 2003 or so, my brother was diagnosed with a degenerative disease in his right eye. And, you know, coming back from that appointment, you know, he started thinking about what it might mean to lose his sight. And what impact that would have on his, you know, his ability to engage with the types of technologies that we were building at the time. And, you know, after doing some digging, we came to learn that in a lot of ways, what we were building was inaccessible.

And it started us looking at the different ways in which a blind individual engages with the digital world. And we came to understand a little bit about screen reading technologies, specifically. You know, there weren’t too many, at that time, that were pervasive in the marketplace. And JAWS is one that stood out. It’s really a Windows-based solution put together by the folks at Freedom Scientific that enabled access for blind individuals.

And, you know, as we started to dive into those types of technologies, and learn more about assistive technology in general, we got to thinking about how we might build a solution that is browser-based and could be kind of included within a web environment, specifically a website. And how we might accommodate users that are accessing the web by providing free access to an assistive utility that emulated a lot of the functionality that was included in the JAWS solution, which is, you know, fairly expensive, and created somewhat of a burden for individuals with disabilities having to pay for a software license to access the digital world.

So that’s kind of where the idea derived from. It took some time for browsers to catch up to the idea, and for us to kind of realize our vision. Nearly a decade had to go by before modern browser technology was up for the challenge, leveraging things like HTML5 and moving away from Flash and other solutions that were inherently inaccessible. But you know, we were able to really start to create a technology that could be included in websites. And that was really just the beginning. Right?

Through a lot of learning, and listening to the marketplace, and understanding the needs of individuals with disabilities, we were able to kind of evolve and understand that not only do we want to create a technology that could be included in a website experience, but also, make websites accessible. So, as we kind of dove into the issues at hand, it became apparent that if a website isn’t developed appropriately and designed with, you know, those standards I mentioned, everything from universal design principles to conforming with the Web Content Accessibility Guidelines, then regardless of what assistive technology you might bring to the user experience, such as JAWS and other screen reading technologies that have come about since then, the experience is abysmal.

And it’s really up to designers and developers to accommodate those use cases and make strides at conforming to those guidelines to ensure access for those individuals with disabilities that are bringing their own assistive technologies to the user experience. So that’s where we started. It really centered around creating an assistive technology. And then, it grew into really a path where we were very much focused on working with designers and developers and creating an overlay technology that would allow us to, on our customers’ behalf, update and enhance the user experience for those users that are relying on assistive technologies to access the digital world.

Steven Brykman:      You know, it’s so amazing to me that– you’re listing these dates– it’s so amazing that all this stuff is happening so recently. I realized that 2000 is now almost 20 years ago. But it’s pretty amazing when you point out that browsers, as transformative and pervasive as they are in our lives, that they just weren’t set up to accommodate these kind of shifts in design, and layout, and transformation and interpretation of data into alternate forms. And, like you said, not until HTML5, and so forth, and getting rid of Flash.

I like to do, what I call, a Welcome Back, Kotter moment, up front. I don’t know if you remember that show. But there was always a couple of minutes up front where Gabe Kaplan would do a little shtick. And it would somehow relate to the topic of the show. So I just wanted to tell you, that here at the house, we have this flock of chickens that we primarily got for my daughter as a therapy animal because my wife and I are allergic to fur. And that’s not a joke.

But the chickens have kind of taken on a second life. They certainly have worked as a therapy animal. And I also use them as almost a mascot for Propelics. Because who doesn’t love some adorable chickens running around? But the thing about these chickens, is that they really hate the heat. It’s like 93 degrees out today.

And the chickens are really uncomfortable right now. They’re like walking around with their wings stuck out. And they’ve got their beaks open, they’re panting. And they can’t– they can’t deal with the heat. Which is funny because everybody wonders how the chickens survive through the winter? That’s all they want to know about, is how are they able to just live in this coop that has no insulation when it’s really, you know, below freezing for the whole night, potentially.

But the truth is, they love it when it’s cold. They have no problems regulating their own body temperatures. When it’s cold, they can like expand their feathers and stuff. And it’s really the heat that they can’t deal with. So the idea is that we really don’t understand somebody else’s problems unless we really experience it for ourselves, or we really see the world through their eyes. Which in a very literal sense, relates to what you guys are doing. So that’s just my anecdote for today.

Sean Bradley:            Yes. I love it. No. It couldn’t be more accurate. Right? I mean, um, absolutely, it’s really about empathy at the end of the day, and trying to put yourself in other people’s shoes. And, you know, by doing that, you can begin to understand and take steps to accommodate those individual needs that you may not have contemplated before. I mean, I certainly appreciate your analogy there.

Steven Brykman:      Right. And just the idea that what we think somebody needs, may be a complete misconception. So why don’t we start at the very beginning. Why don’t you start by just defining what accessibility is.

Sean Bradley:            Perfect. Yeah. And you know, first off, we want to just say once again, thank you for this opportunity today, Steven. We, you know, are very grateful for being able to speak with you today, and share the story of accessibility a little bit. You know, I think it really comes down to the ramps and rails to the digital world. Right?

Providing equal and unimpeded access to all things digital. And really making sure that we aren’t discriminating against anybody with individual needs. Each individual has their own experience. Because typically, as it pertains to disability, there’s a countless number of disability use cases that need to be taken into account. And accessibility is really about making sure that we are taking steps to accommodate all users, all individuals, regardless of their individual abilities.

Steven Brykman:      What sorts of disabilities are you looking to specifically accommodate?

Sean Bradley:            You know, when it comes to accessibility, one of the ones that tends to come up first and foremost is blindness. You know, that use case where an individual is needing to be provided access, and isn’t able to engage in a way that is most familiar for so many of the rest of us. And that is very much at the top of the list, when you think about accessibility. It’s very common for individuals to kind of think about a blind user and how they might engage with the digital world.

But by no means, is that where it ends. That’s just the tip of the iceberg. When you think about other disabilities, there’s an unlimited number of them. Low vision users– obviously, users that are looking to magnify the visual experience and adjust the display of that environment that they might be engaging with. To blow it up and have a closer view into their viewport.

Color blindness is another one, very common one. You know, there’s lots of considerations in accessibility and making sure that digital environments can conform to the necessary standards that would accommodate individuals with colorblindness. Which we can kind of go into a little bit. Certainly deaf and hard of hearing is another one. Just the idea that a user is going to need to read and be provided access to information that might be relayed through media content. So videos need to be captioned and transcribed.

Cognitive disabilities. And coinciding with that, learning disabilities– individuals that are looking to have some level of hand-holding, or assistance from the perspective of making content more digestible, more readily understood by simplifying content and providing access to content in different ways. Providing multiple paths to understanding and accessing content is critical. Making sure that the web copy and the way things are being described within an environment isn’t too complex.

Making sure that information is conveyed in multiple ways. You know, so if you think about an icon. An icon can be very subjective. And everything from a very common thing like a hamburger menu, that might provide access to the menu of a website, may not be readily understood. So it’s very nice and appropriate to accommodate those users by providing the text equivalent when providing something as simple as an icon. That way they can read and understand what that icon means, or what it symbolizes.
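To make the icon example concrete for readers following along, here is a minimal TypeScript/DOM sketch of one common way to give an icon-only control a text equivalent: a hamburger-style menu button whose glyph is hidden from assistive technology and whose accessible name comes from an aria-label. The class name, glyph, and label text are illustrative choices, not taken from any particular site.

```typescript
// Minimal sketch: an icon-only menu button given a text equivalent.
// The class name, glyph, and label text are illustrative.
function createMenuButton(): HTMLButtonElement {
  const button = document.createElement("button");
  button.className = "hamburger-menu";

  // The glyph alone carries no meaning for a screen reader user...
  const icon = document.createElement("span");
  icon.setAttribute("aria-hidden", "true"); // hide the decorative glyph from assistive tech
  icon.textContent = "\u2630"; // ☰, commonly used as a "hamburger" icon

  // ...so the control itself gets an explicit text equivalent.
  button.setAttribute("aria-label", "Open navigation menu");
  button.appendChild(icon);
  return button;
}

// Attach the button once the document body exists.
document.addEventListener("DOMContentLoaded", () => {
  document.body.appendChild(createMenuButton());
});
```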

So those are kind of rudimentary examples of how digital content needs to accommodate different use cases. Another one would be autism. Simplifying content. Reducing noise. Reducing the clutter that can come with websites, and all of the distractions that come with advertisements, and things along those lines. Providing a way for users to kind of customize that experience is critical.

And certainly motor and ambulatory disabilities. You know, thinking through the use case of users that might not be accessing the web using a mouse. And providing keyboard access, or certainly making sure that the web environment can accommodate users that might be using sip-and-puff technologies, and other types of assistive technologies. You know, through all the different disability use cases, users might be coming to your user experience using their own assistive technologies. And it’s important to accommodate those different use cases and make sure that steps are being taken to optimize the user experience for all individuals, regardless of their individual ability.

[MUSIC PLAYING]

Steven Brykman:      Can we just back up a little bit.

Sean Bradley:            Sure.

Steven Brykman:      Get a little more history. If you could give us a little more context around what the Americans with Disabilities Act is. And how that helped create this movement?

Sean Bradley:            Absolutely. So it was enacted in 1990, and signed by Bush Sr. It’s a Civil Rights Law– the Americans with Disabilities Act is a Civil Rights Law that prohibits discrimination against individuals with disabilities. One of the authors of the Americans with Disabilities Act, I’m proud to say, is Tony Coelho. He’s on our board of directors. He’s very active. And he has been very active for years now, in helping us build our company. He just, you know, lives and breathes accessibility.

He was instrumental in authoring this very important document, which has really paved the way for a countless number of international laws and regulations that are really derivatives of the ADA. Everything from, in Ontario, Canada, the Accessibility for Ontarians with Disabilities Act, which was enacted in 2005, also referred to as AODA. The Equality Act in the UK. The Web Accessibility Standard in New Zealand. Disability Act of 2005, in Ireland. And countless other examples of laws and regulations that are really intended to prohibit discrimination, and really focus around making sure that individuals with disabilities are accommodated.

And kind of going back to your chicken analogy, it’s really centered around empathy and putting yourself in other people’s shoes to ensure that as we build things, and as we provide access to individuals, that we accommodate the needs of all potential individuals, regardless of their abilities.

Steven Brykman:      So were these new accessibility guidelines mandated? Or were traditional digital interfaces seen as being discriminatory based on being only usable for those without disabilities? Or how did that work?

Sean Bradley:            Sure. So the ADA really came about prior to the proliferation of the internet. And it really was centered around physical establishments. Right? You know, if you think about ramps and rails, it’s really all about providing access, ensuring access is unimpeded as users are coming in and out of buildings, or trying to get access to restrooms, and drinking fountains, and other public accommodations that are made available.

And obviously, as we transition from the physical to the digital world, the ADA was authored in a way that would kind of anticipate the changing world. And it would allow us to ensure that as we create new technologies and things along those lines, new ways of accessing the world, individuals with disabilities aren’t left behind.

Title III, specifically, within the ADA, really prohibits discrimination by any entity that is providing a public accommodation. So retailers and restaurants need to take into account the level of access that they’re providing. And as these retailers, for example, transitioned from physical buildings to providing access to buy goods online, the ADA was broad enough, and Title III specifically was broad enough, to accommodate individuals with disabilities in those scenarios as well.

So Title II of the ADA, specifically, refers to public entities, and ensures access for schools, institutions, and other similar bodies, so that they accommodate individuals with disabilities seeking access to those environments. So, you know, it was really essential, in the way in which the ADA was authored, that regardless of how we evolve as a society, individuals with disabilities are not left behind. And it’s thanks to the structure of the ADA that we’re able to prioritize accessibility into the future.

Steven Brykman:      And when it comes to the digital world, and the enterprise, like who’s the responsible party to ensure that these requirements are met? Is it the software developer? Or is it the enterprise that’s using the software? Or is it some combination?

Sean Bradley:            Yeah. Good question. Right? It’s really the publisher of the digital content. You know, whoever kind of owns and operates that environment online is really responsible for making sure that the information provided, and the services provided through that environment, are unimpeded.

And it’s common for digital environments to incorporate information from different third parties and resources. So if the information being provided through those environments is really linking to others, to the information and services provided via other third parties, it’s really up to the host and operator of that environment to ensure that those third-party solutions are also made accessible. Or that they’re linking to, or relying on, services that are also made accessible.

So, you know, it’s kind of a mixed bag. Right? It’s not only the information that is hosted and maintained by the operator and publisher of content. But also, taking into account the many third parties that might be integrated with that environment. So it’s a non-trivial ask to ensure that the owners of these web properties are taking into account the many nuances and details of where that digital content is really being hosted and maintained, and the different vendor-supplied services that might be critical to that digital environment.

[MUSIC PLAYING]

Steven Brykman:      Is there an established set of rules around the UI, UX around accessibility? Like, is there a standard? For instance, font sizes need to be scalable up to 48 point size, say. Or colors need to be manipulatable to a certain contrast. Things like that. Does a set of guidelines exist somewhere?

Sean Bradley:            There is. And first and foremost, there’s really Universal Design Principles. Right? So Universal Design consists of a set of seven principles that were essentially developed and designed right around the 1995-ish time frame, or shortly thereafter. And it was really architects and product designers that were responsible for kind of establishing that set of guidelines.

And it’s because of the Universal Design that we have things like auto-opening doors, or drinking fountains that aren’t too high to accommodate an individual sitting in a wheelchair. And things along those lines. Right? Ramps and rails, curb cuts– those types of Universal Design Principles have been deeply integrated in the physical world.

And when you apply that to the digital world, you know, you’re absolutely right. It has to do with the ways in which web copy and fonts are provided, accommodating color contrast. Right? So making sure that the text that’s in the foreground isn’t displayed in front of a background that has a color palette that just isn’t conducive to making that web copy legible. So it’s those types of things that are really derived from the Universal Design Principles.
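The color-contrast point maps to a specific, testable formula. The sketch below implements the WCAG 2.x relative-luminance and contrast-ratio calculation in TypeScript and checks a sample color pair against the 4.5:1 threshold that WCAG AA sets for normal-size body text; the sample colors are arbitrary.

```typescript
// Contrast-ratio check based on the WCAG 2.x relative-luminance formula.
// Colors are given as [r, g, b] with each channel in 0-255.
type RGB = [number, number, number];

function relativeLuminance([r, g, b]: RGB): number {
  const linearize = (channel: number): number => {
    const c = channel / 255;
    // Piecewise sRGB linearization from the WCAG definition.
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

function contrastRatio(foreground: RGB, background: RGB): number {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: light gray text on a white background comes out around 3:1,
// which fails the WCAG AA requirement of 4.5:1 for body text.
const ratio = contrastRatio([150, 150, 150], [255, 255, 255]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA");
```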

And further, there’s what’s called the Web Content Accessibility Guidelines. A set of internationally recognized standards and testable success criteria that allows designers and developers to benchmark against a set of guidelines and principles that help them kind of accommodate individuals with disabilities and the different disability use cases that need to be considered when publishing digital content.

And, you know, these techniques can be deployed when creating digital content to make sure that the various disability use cases are accommodated and taken into account. And essentially, the testable success criteria are essential, and have really become the gold standard for ensuring that the ADA is adhered to. You know, one’s obligations under the ADA are conformed with through an adherence to those standards.

Steven Brykman:      Thanks. You know, I’ll be sure to provide some links to those Web Content Accessibility Guidelines and Universal Design Principles in our Propelics blog for this episode. Apple has sort of been a champion of accessibility. So these days, like, how would you say that Apple’s features compare to Windows? Like, are they required to offer the same functionality in those terms? In terms of accessibility and what they enable, you know, someone with special needs to be able to do with their data?

Sean Bradley:            Yeah. Yeah. It’s a great, great question. Right? And Apple, absolutely, is a leader in accessibility. Microsoft, as well. Google and, you know, some of the leading technology companies in the world have really come to understand that there is a vast number of individuals with disabilities in this world.

Google commonly refers to the one billion individuals with disabilities in today’s world. And obviously, wanting to create technologies that don’t impede access for anyone. And certainly adhering to their obligations under the ADA, and all the international laws out there, you know, they certainly realize that they needed to make strides of their own to accommodate individuals with disabilities. And they’ve done a great job.

Apple, in particular. Right? With their VoiceOver technology. So today, when you buy a Mac, or any device with the iOS or macOS operating systems, it has built-in screen reader technology, and other assistive technologies that have been built into the operating systems. Microsoft now provides the Windows Narrator. And certainly with Android devices, they provide the TalkBack screen reading technology, and other assistive technologies that have been built in. And these have been essential. Right? These are essential tools to accommodate individuals with disabilities.

And where we kind of come in is with the web publishers that are providing, or building, and designing, and publishing, websites and web applications that are intended to be accessed through these operating systems. And specifically, through the built-in browsers, or the types of browser technologies that you can download and install onto your operating system. If those web publishers don’t take into account those Web Content Accessibility Guidelines, and the different Universal Design Principles and whatnot, then regardless of how powerful and beneficial those built-in technologies, like VoiceOver, and TalkBack, et cetera, are, the user will still be left behind.

So it’s really up to the web developers and web designers to make sure that they are doing their part to ensure that the content and the functionality that they’re making available, the different programs and services that they make available online, have accommodated those different technologies that are kind of built into these operating systems. Or the types of assistive technologies like JAWS, and NVDA, and other types of assistive technologies that are essential for ensuring access and enabling access for individuals with disabilities.

[MUSIC PLAYING]

Steven Brykman:      So your Digital Accessibility Platform– it sort of sits on top of all that. And it serves to make it more available, or it enables customization of content that wouldn’t otherwise be customizable, or how does it provide like, a supplementary benefit?

Sean Bradley:            Yeah. So our Digital Accessibility Platform is really the tool– it’s really two-fold. Right? It’s a tool for designers and developers. They can use the Digital Accessibility Platform to help them test and evaluate digital content, specifically in relation to those Web Content Accessibility Guidelines I referred to. Specifically, we refer to them as WCAG. W-CAG. There’s a 2.0, and most recently, a 2.1 version of those standards that provide those testable success criteria. And the Digital Accessibility Platform provides a window into those tests that can be conducted. Some of which allow the developer and designer to glean information about the accessibility of their content through automation.

So from an automated standpoint, tests can be run, and the results of those tests, can provide information about how accessible the environment is. It’s also a platform in which manual testing can be conducted. It provides guided testing for manual testers. And manual testing is an essential process for any digital content provider where subject matter experts, and users of assistive technologies– in particular, individuals with disabilities– are using these assistive technologies to test content. And they’re maintaining an audit of that environment using the Digital Accessibility Platform.

So the Digital Accessibility Platform that we provide our customers is essential for maintaining those audits. And it’s really, most importantly, the tool– the back-end tool we use on our customers’ behalf, to provide our managed service, which we call Ally. Our Ally managed service allows us, AudioEye, to do the work of testing and even remediating digital content on our customers’ behalf. It allows our customers to install an overlay technology.

By installing a script into their website, it empowers AudioEye to control the level of accessibility of that environment to make it accommodate those assistive technologies that we’ve referred to, such as JAWS and others– VoiceOver, and TalkBack, and the different assistive technologies that empower access– enable access for individuals with disabilities. We can use our technology, or our overlay technology, to dynamically fix issues of accessibility that have been identified within the Digital Accessibility Platform, and that have also been identified through a manual testing process that we conduct on our customers’ behalf.
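As a rough illustration of what dynamically fixing issues through an embedded script can look like in principle, here is a hypothetical TypeScript sketch that applies a list of attribute-level fixes to a page at load time. The fix format, selectors, and values are invented for illustration only; this is not AudioEye’s actual script, API, or remediation logic.

```typescript
// Hypothetical illustration of runtime remediation via an embedded script.
// This is NOT AudioEye's implementation; the fix format below is invented.
interface RemediationFix {
  selector: string;                   // CSS selector for the element to fix
  attributes: Record<string, string>; // attributes to set, e.g. alt or aria-label
}

// In a real managed service, fixes would be authored by testers and fetched
// from the vendor; here they are hard-coded purely for illustration.
const fixes: RemediationFix[] = [
  { selector: "img.hero-banner", attributes: { alt: "Spring sale: 20% off all shoes" } },
  { selector: "input#search", attributes: { "aria-label": "Search products" } },
];

function applyFixes(fixList: RemediationFix[]): void {
  for (const fix of fixList) {
    document.querySelectorAll<HTMLElement>(fix.selector).forEach((element) => {
      for (const [name, value] of Object.entries(fix.attributes)) {
        element.setAttribute(name, value); // apply each attribute-level fix
      }
    });
  }
}

// Apply remediations once the DOM is ready.
document.addEventListener("DOMContentLoaded", () => applyFixes(fixes));
```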

So it’s really a great way for our customers to essentially outsource the accessibility solution. We provide the Digital Accessibility Platform as a tool to empower our customers to do it themselves. But a vast majority of them are looking to outsource. They’re looking to say, hey, this is a big undertaking, trying to maintain the level of accessibility of our web environments. And it requires subject matter expertise.

And by simply allowing AudioEye to control and maintain accessibility on their behalf, it just makes a lot of sense. It’s more cost effective. It’s more practical. It provides speed to compliance. Because it can take a lot of time and effort to bring internal resources up to speed. To educate them about the nuances of accessibility, and all the rules, and testable success criteria, et cetera. Whereas, if they can just embed a script, through the AudioEye solution, AudioEye can do the vast majority of the heavy lifting that is required to achieve compliance, and achieve substantial conformance to those standards that we’ve discussed.

Steven Brykman:      Sure. That makes a lot of sense. Now you mentioned JAWS a number of times. Can you just explain what that is? And define what that is in a little more detail for folks?

Sean Bradley:            Yeah, definitely. So JAWS is very similar to VoiceOver, which is very similar to Narrator, and TalkBack. These are all screen reading technologies. And what they do is they interpret the content that is displayed within a website, or a web application, and they read aloud the content.

So it allows the user to listen to the content. It allows them to use their keyboard and other technologies to navigate by listening. It converts text to speech. And in real time, it allows the user to kind of engage and interact with a digital environment that is typically designed for a visual use case. Right?

So a user that is typically using a website is going to have a visual experience, and they’re going to use their mouse, and they’re going to navigate around. And screen reader technology, like JAWS, allows a user without sight, or with low vision, to interact with that environment by converting text to speech, reading it aloud, and providing a series of very sophisticated key commands to allow the user to jump around and navigate that environment by listening and controlling that experience through a different paradigm.

Steven Brykman:      Oh, I see. That’s really cool. And so what does accessibility testing look like? Does it give a web app, or website, a score? And does it score it on certain criteria? Does it look for text variability, and text to speech, and colors and so forth? What, could we just–

Sean Bradley:            Yeah. Great question.

Steven Brykman:      –get into that a little deeper?

Sean Bradley:            Yeah. Sure. Sure. So accessibility testing– a portion of it can be done through automation. Approximately 30% to 40% or so. And that means, programmatically, tools like the Digital Accessibility Platform, can evaluate a website or web application. It can kind of parse through the different pages within that environment. Identify the different templates. And run tests against that testable success criteria that’s provided through the Web Content Accessibility Guidelines.

And resulting from those tests, it can say, OK, we understand that these are issues that have been programmatically identified. And they’re going to need attention. The designer and developer are going to need to change the way in which they’ve presented this information, and the different elements that exist within that environment, to accommodate the best practices.

So, you know, again, about 30% to 40% of the issues that might pertain to those guidelines can be identified programmatically. And then, the other remaining issues that might exist need to be identified through manual observation and through manual testing, which is typically conducted by a subject matter expert. And in a lot of cases, that subject matter expert is going to use an assistive technology such as JAWS, or those different types of screen readers, to evaluate the content.

They’re going to spend time interacting with that environment, using those tools. And then, record their findings within the Digital Accessibility Platform, or another type of solution, that allows them to maintain an audit, and essentially record the different steps that require attention, allowing the designer or the developer to understand the different issues. And then go make changes to better conform to those standards.
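To give a flavor of the roughly 30% to 40% of checks that can run programmatically, here is a minimal TypeScript sketch of two automated tests: images without a text alternative, and form controls without an accessible name. Real audit tooling, including commercial platforms, evaluates a far broader and more nuanced WCAG rule set; the logic here is deliberately simplified.

```typescript
// Minimal sketch of two automated accessibility checks; real audit tools
// run far broader WCAG rule sets than this.
interface Finding {
  element: Element;
  issue: string;
}

function auditPage(root: ParentNode = document): Finding[] {
  const findings: Finding[] = [];

  // Check 1: images should have a text alternative (or be marked decorative).
  root.querySelectorAll("img").forEach((img) => {
    if (!img.hasAttribute("alt")) {
      findings.push({ element: img, issue: "Image is missing an alt attribute" });
    }
  });

  // Check 2: form controls need an accessible name: a <label for>, aria-label,
  // or aria-labelledby. (A simplification of the full accessible-name logic.)
  root.querySelectorAll("input, select, textarea").forEach((control) => {
    const id = control.getAttribute("id");
    const hasLabel =
      (id !== null && root.querySelector(`label[for="${id}"]`) !== null) ||
      control.hasAttribute("aria-label") ||
      control.hasAttribute("aria-labelledby");
    if (!hasLabel) {
      findings.push({ element: control, issue: "Form control has no accessible name" });
    }
  });

  return findings;
}

// Print the findings as a simple report.
console.table(auditPage().map((f) => ({ tag: f.element.tagName, issue: f.issue })));
```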

Steven Brykman:      Great. Thanks. And when it comes to the enterprise, where do these accessibility guidelines play the biggest role? Like, where do you see them being used the most?

Sean Bradley:            You know, any website, web application, mobile environment, any digital interface, right? Needs to conform to some level of these standards. The different types of issues that are common are somewhat unique from one environment to another. It really just comes down to the understanding that the designer or developer had when they were creating that content, or that environment. And, you know, if they didn’t have any understanding, or didn’t take any steps to accommodate individuals with disabilities, it’s likely that they’ve taken shortcuts, or haven’t really created that content with a full sense of the important steps that are required to ensure access.

So, you know, it varies from one environment to another, the types of issues that you find. But common issues pertain to the ways in which a web form has been created. If the web form isn’t labeled properly, a user that’s using assistive technology to engage with that web form won’t necessarily be provided the necessary information to engage with it.

So for example, if the web form provides a visual indication of what the web form is asking of the user– for example, their first name, or their address, or email address– if the web form isn’t developed in the appropriate way, the screen reader technology won’t understand what the web form is intending. And won’t convey the information to the user. So the designer and the developer have to take steps to accommodate that use case.

Another example would be images. It’s very common for an image to be uploaded to a website without a description, a text-based description, that would then allow the screen reader technology to convey a verbal description of what that image is conveying. And everything from website carousels, to inputs and buttons, and all of the different types of elements that you might see within the modern web, have to be considered when developing and designing for individuals with disabilities.
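For reference, the "developed in the appropriate way" version of those two examples might look like the following sketch: a form field explicitly associated with a visible label, and an image carrying a text alternative. The field names and image path are made up for illustration.

```typescript
// Sketch of accessibly authored markup for the two examples discussed:
// a labeled form field and an image with a text alternative.
// The field names and image path are invented for illustration.
document.addEventListener("DOMContentLoaded", () => {
  // A visible <label> tied to the input by id, so a screen reader announces
  // "Email address" when the field receives focus.
  const label = document.createElement("label");
  label.htmlFor = "email";
  label.textContent = "Email address";

  const input = document.createElement("input");
  input.type = "email";
  input.id = "email";
  input.name = "email";
  input.autocomplete = "email";

  // An image whose alt text conveys the same information as the picture.
  const photo = document.createElement("img");
  photo.src = "/images/store-front.jpg";
  photo.alt = "Customers entering the downtown store at opening time";

  document.body.append(label, input, photo);
});
```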

[MUSIC PLAYING]

Steven Brykman:      So one of your other products, or maybe this is one aspect of your product, is the Ally Toolbar. Could you talk about that? And how that relates to your overall product offerings?

Sean Bradley:            Yeah. So the Ally Toolbar is really, you know, the cornerstone of our offering. Right? The original idea behind the company is what has become the Ally Toolbar. And the Ally Toolbar provides what we call The Little Man In Blue. He’s a set of tools, assistive utilities, that enable access, or allow users to personalize their web experience.

It’s typically provided as an icon in the bottom right-hand corner of our customers’ websites. It’s really the on-ramp to a set of tools that allow users to customize the user experience. What we’ve done is look at the different assistive technologies that are in the marketplace, everything from JAWS and screen magnifiers to other different types of tools, and provided similar levels of utility through a series of options that are made available to the user.

So for example, they can engage a player utility. And the player emulates the screen reader technology. It allows the user to use a keyboard to navigate around. It converts text to speech. And really emulates that screen reader experience. There’s a Reader Tool. The Reader provides access to enlarge the visual display for low vision users. It simplifies the user experience for individuals with cognitive and learning disabilities and autism.

It allows the user to adjust the color contrast, for individuals with color blindness. And it allows for the user to manipulate the fonts. There’s a specific font that’s been known to assist individuals with dyslexia, and provide just different options, allowing the user to customize the user experience.

There’s a utility that simplifies the menu structure. There’s a tool that provides more options for keyboard users, allowing them to quickly navigate between different elements. And there’s also a voice-centric tool that allows the user to actually speak verbal commands. And through spoken commands, control the browser as if they were using their mouse, or their keyboard.

And so by saying things like, go back, go home, scroll up, scroll down, the user can manipulate the browser experience within that website, using those tools. And, you know, it’s really just the icing on the cake. Right? It’s a set of tools that’s there to accommodate different disability use cases, but it’s really on top of everything else we’ve done to fix the website to conform to those standards I’ve been discussing. And fix the website in a way that maximizes and optimizes the performance of that website for users that are coming to the web environment using their own tools, such as JAWS, and others.
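The player and voice utilities described here are AudioEye’s own products, but the browser primitives that this kind of tooling can build on are standard. The sketch below uses the Web Speech API to read the current text selection aloud and, where the browser exposes speech recognition (Chrome ships a prefixed webkitSpeechRecognition), to handle a couple of simple spoken commands. It illustrates the underlying browser capabilities only; it is not the Ally Toolbar’s code, and the Alt+R shortcut is an arbitrary choice.

```typescript
// Illustration of standard browser primitives an assistive toolbar can build on.
// This is not AudioEye's toolbar code.

// 1. Text-to-speech: read the user's current text selection aloud.
function readSelectionAloud(): void {
  const text = window.getSelection()?.toString().trim();
  if (!text) return;
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = 1.0;            // speech rate could be made user-adjustable
  window.speechSynthesis.cancel(); // stop anything already speaking
  window.speechSynthesis.speak(utterance);
}
document.addEventListener("keydown", (event) => {
  // Alt+R as an illustrative shortcut to trigger read-aloud.
  if (event.altKey && event.key.toLowerCase() === "r") readSelectionAloud();
});

// 2. Voice commands: support is still browser-specific, so feature-detect first.
const Recognition =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
if (Recognition) {
  const recognizer = new Recognition();
  recognizer.continuous = true;
  recognizer.onresult = (event: any) => {
    const phrase = event.results[event.results.length - 1][0].transcript
      .trim()
      .toLowerCase();
    if (phrase.includes("scroll down")) window.scrollBy({ top: 400, behavior: "smooth" });
    if (phrase.includes("scroll up")) window.scrollBy({ top: -400, behavior: "smooth" });
    if (phrase.includes("go back")) history.back();
  };
  recognizer.start();
}
```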

Steven Brykman:      Yeah. That’s just so awesome. So, I’m a little confused about– are browsers required to incorporate these kinds of settings and preferences? And I guess I just don’t really understand why they wouldn’t be required to do so.

Sean Bradley:            Yeah. So, I mean, the browsers are really just a gateway into web content that’s been published by web publishers. So the browsers themselves, continue to get more sophisticated. Different browsers have different options to accommodate disabilities. You know, there are some that have a reader tool kind of built into them. Safari, you know, there’s the option to kind of simplify the display. There are certainly options in almost all the browsers, I think, to enlarge and decrease the size of the visual display.

And, you know, so there are built in tools, absolutely. And they differ from one browser to another. There are different commands to control those, you know, from a keyboard perspective. Or trying to find where to– from a pull down menu, where to engage those types of tools.

So, with the Ally Toolbar, you know, we’ve normalized it. Right? So regardless of what browser you’re on, or regardless of what AudioEye-enabled website you’re on, you’re going to have a consistent experience. There’s going to be a set of tools there that are really just there to accommodate your user preferences. And they’re going to be consistent across all browsers.

So, whereas the browsers are doing a better and better job of accommodating different disability use cases, and providing different options, they are always kind of presented in ways that are unique from one browser to another. And the idea with the Ally Toolbar is to recognize that, and provide a consistent user experience regardless of the browser; it’s essentially browser agnostic.

Steven Brykman:      Yeah. That makes a whole lot of sense. I mean, especially when you’ve got somebody who’s already facing these barriers to digesting content, having the ways to get around that be different depending on what browser, I can see that being totally frustrating. And so, your solution is– I mean. It’s cross-browser. It’s also cross-platform. Right?

Sean Bradley:            Exactly. Exactly. So as you transition to the mobile environments, our focus is really, again, empowering the user that’s going to be using their VoiceOver technology, or their TalkBack technology, and making sure that the underlying environment that the user is trying to access is presented in a way that is going to properly communicate with those tools. So that’s our primary focus in mobile. VoiceOver, in particular, is really just a fantastic solution for individuals with blindness, and others, that is essential for us in enabling access for the user.

So our primary focus in the mobile environment is fixing that underlying code to accommodate those users. And then, of course, as an option, we provide access to our tools and also our Help Desk utility, which provides a means of allowing a user to report an issue, should they encounter one, all within a mobile environment, et cetera.

Steven Brykman:      Oh, so there’s also reporting built in, as well. That’s awesome.

Sean Bradley:            Yeah.

Steven Brykman:      You also talked about how AudioEye helps ensure compliance. Could you talk about how it does that, and what it means to be certified by AudioEye?

Sean Bradley:            Yeah. Great question. Right? So I mean, it comes down to– compliance really comes down to providing unimpeded access, equal access, equivalent access. And for our customers that have enabled our managed service, we are able to certify them because, you know, first and foremost, we have their back. It means that that organization, or business, has taken steps and is making strides to recognize the importance of digital inclusion. And they’re committed to not only maximizing their conformance with the international standards and best practices made available through the Web Content Accessibility Guidelines, but they’re also going to do everything they can to maintain that level of conformance.

And as the standards evolve and change, which they are– just recently, in June, the Web Content Accessibility Guidelines 2.1 came out, and were officially published. So the standards are changing. They’re going to continue to change. They’re going to continue to accommodate more disability use cases. The latest set of standards from 2.1 are very mobile-centric. They’re very cognitive-disability-centric. Which is fantastic. There’s more consideration being taken into account.

And as those standards evolve, the AudioEye certification means that AudioEye is going to continue to adapt that environment, and make sure that digital inclusion is front of mind and kind of adhered to over time. And we’re going to continue to make incremental progress on maximizing our customers’ conformance. And that’s really what certification is all about. It’s to tell end users that are coming to that experience, or that web environment, or web application, that digital inclusion is a priority. And it’s going to continue to be one because we are partnered with AudioEye, who is solely focused on making sure that we are meeting the needs of individuals, regardless of their individual abilities.

Steven Brykman:      Awesome. So you mentioned some more attention being paid toward cognitive issues. So what kinds of things is AudioEye looking at, in terms of some future features?

Sean Bradley:            So first of all, you know, as it pertains to WCAG 2.1, right? So there’s new testable success criteria. Lots of which have to do with making sure that content is made digestible and more readily understood. So there’s new tests that need to be accommodated. Right? Which means there’s more work to be done from a testing standpoint, and potentially, from a remediation standpoint.

So as these new tests come about, there’s new things that can fail. And as new failures are identified, changes need to be made to that content, or that web environment. And whether it’s AudioEye making those changes, or us working with our customers to make sure that they’re considering these new standards, there’s just, essentially, more work to be done, and more considerations that need to be taken into account and adhered to over time.

Steven Brykman:      And are you looking at any other, like, new technologies? Like, are you thinking about augmented reality? Or are you thinking about maybe using AI or machine learning to help with the automation of testing? Or any of those kinds of avenues?

Sean Bradley:            Absolutely. Machine learning is a big part of our focus right now. We’re continuing to use machine learning and artificial intelligence to help us glean more information about the types of issues that can be programmatically identified, and also programmatically fixed. Right?

So if we can know that something is wrong with 100% certainty, there’s likely a good chance that we can also programmatically fix those things. So by using machine learning, and artificial intelligence, we can make this process more seamless and less burdensome. So that’s certainly a focus.

Another focus of ours is centered around our Voice utility and our kiosk solution. So we feel that with Voice, and enabling access for users using voice commands, that kind of transcends the traditional accessibility paradigm. And has a lot of utility in the same sense that speaking to Alexa, and other voice driven technologies, can enable access in a lot of meaningful ways. So as we look to proliferate more of our customer websites with our Voice technology, we want to make sure that that’s a great user experience.

And as it pertains to kiosks– the providers of kiosks, and certainly the software and the digital solutions that are provided through kiosk technologies, need to accommodate individuals with disabilities just in the same sense that websites and web applications need to. So there’s a big opportunity for kiosk providers to– and certainly there is a big challenge for kiosk providers to make sure that they are accommodating individuals with disabilities. And we look to bring our Ally managed service, and our Ally solutions, to those environments as well.

So we’ve got some big undertakings as we look to expand the breadth and reach of our offering. And certainly a great deal of challenges to overcome as we assist our customers in making sure that whatever digital environment they’re creating, and publishing, and making available in the world, is made accessible. And presents all users with equal opportunity to engage and interact in meaningful ways.

[MUSIC PLAYING]

Steven Brykman:      Very cool. You know, you mentioned autism really early on, which was kind of surprising.

Sean Bradley:            We continue to consider autism in the technologies that we’re building. We’re working with the Southwest Autism Research & Resource Center, which is also called SARRC. And, you know, they’re big advocates. And we continue to work with them on trying to expand some of the capabilities of our tool set made available through the Ally Toolbar. But we are always kind of open to getting more feedback, and gleaning more insights, in terms of how we can accommodate individuals, certainly, regardless of their individual needs.

Steven Brykman:      Great. So I’m going to throw one final curve ball at you. So my family suffers from Fragile X syndrome, which in males, manifests itself as autism, which my brother has. And in my younger sister, it manifests itself in a number of ways. The predominant symptom is anxiety for her.

In fact, she’s so anxious that she does not have a computer. She doesn’t have a cell phone. Because she’s worried that data is going to be taken from her by somebody, which actually, may not be all that crazy these days. But she literally has no online presence. She has no– she doesn’t engage in social media or anything.

Which bums me out, frankly. Because I think she could really benefit from it. She’s kind of a recluse. She’s really shy. She has trouble with social interactions, you know, face to face. And I think that something like social media could actually be a huge help for her, in helping her find her own community and so forth. Have you ever considered anything that might help someone like her just feel more OK with technology?

Sean Bradley:            Uh, you know, I appreciate the question. You know, I think, you know, I’d say that we have members of our technology team that have the same level of anxiety, and distaste for social media, and things along those lines. So I think we can empathize from that perspective. That said, you know, trying to provide information about the secure browser technologies, and kind of the best practices, as it pertains to engaging in the digital world, I think is kind of a good starting point.

I think, kind of education, from the standpoint of, you can engage in a way that would make you feel secure. I can certainly appreciate a lot of the concerns as it pertains to security, and feeling vulnerable, and things along those lines. One of the things we’ve done at AudioEye is get our entire staff trained up on the different types of security concerns that one should have when they’re publishing content, or when they’re checking their emails, and the types of phishing and social engineering considerations that need to be taken into account.

Because, you know, obviously, there are a lot of malicious ways in which a hacker and others might kind of keep one in a state of being paralyzed by the sense of the types of threats that are out there. And things along those lines. So I’d say it starts with education, in kind of just understanding what those risks are. And understanding how one can be manipulated through digital content. Everything from, like I said, checking your email to engaging on social media. But certainly a good area of focus and something that we can empathize with.

Steven Brykman:      Yeah. I think, it would– I mean, something like that, I think, would take a combination of like, security, as you mentioned. And then, also, some serious filtering. Because, you know, we all get spam emails, or emails from people who claim to have stolen our data, sometimes very convincingly. And if she got something like that, it would like just send her into a total panic.

Sean Bradley:            Sure. Sure. Understandable.

Steven Brykman:      It would have to be able to filter out all of that stuff and just prevent it from ever getting to her. But I think, it is viable, that a product like that could be created where it just claims to be a safer experience. But, who knows.

Sean Bradley:            No. Definitely. I love that area of focus. I mean, you know, like you said, it’s really an advanced filtering type of solution where your environment is controlled by you. And you get to select and manage your level of security. And kind of that filtering. So it would be a very fascinating path to kind of go down. And I encourage you to keep noodling on that. And I think we can be doing the same.

Steven Brykman:      Especially where some of these cognitive issues are sort of having more attention paid to them. And are being addressed more and more. Maybe we’ll see anxiety being one of those.

Sean Bradley:            Sure.

Steven Brykman:      Well Sean, thank you so much for hopping on. This was a great discussion. And congratulations on producing such an amazing product that’s going to help so many people.

Sean Bradley:            Thank you. It’s really an honor to be part of this. And just thank you for your time, and this opportunity today.

Steven Brykman:      It’s great. We do a lot of episodes about, like, Amazon Web Services, or serverless backends, and things that, you know, improve productivity, and make data safer, and more accessible. But what you’re doing is genuinely so much more valuable to society at large. So I just want to thank you.

Sean Bradley:            Thank you for that.

Steven Brykman:      And thank you, everyone, for hopping on to another episode of Device Squad. The podcast for the Mobile Enterprise from Propelics, an Anexinet Company. Take care.