
Not content with improving its own platforms’ accessibility, Google is releasing a new tool — Accessibility Scanner for iOS — designed to make it easier to develop iOS apps that accommodate visually and hearing-impaired users. It’s available in open source on GitHub.

As Google’s Sid Janga explains in a blog post published today, Accessibility Scanner for iOS, or GSCXScanner, helps discover, debug, and fix common accessibility issues in iOS codebases. “App development can be a time-consuming process, especially when it involves human testers,” he wrote. “When a new feature is being developed, often there are several iterations of code changes, building, launching and trying out the new feature. It is faster and easier to fix accessibility issues with the feature if they can be detected during this phase when the developer is working with the new feature.”

GSCXScanner is an Objective-C library that sits in an iOS app’s process. Installing it overlays a tappable, draggable “Perform Scan” button atop the target app that kicks off a scan for bugs. GSCXScanner principally uses GTXiLib, a library containing a number of built-in checks for accessibility issues, but also supports an extensible plugin framework for adding custom tests.
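The kinds of built-in checks such a scanner runs are conceptually simple — for instance, verifying that every interactive element exposes a non-empty accessibility label for screen readers. The sketch below illustrates the idea using a hypothetical `Element` struct; it is not GTXiLib’s actual API, which operates on live UIKit elements.

```swift
import Foundation

// Hypothetical model of a UI element's accessibility attributes.
// GTXiLib's real checks inspect UIKit views at runtime, not a struct like this.
struct Element {
    let isInteractive: Bool
    let accessibilityLabel: String?
}

// Sketch of an accessibility-label check: interactive elements must
// carry a label that is non-empty after trimming whitespace.
func hasValidAccessibilityLabel(_ element: Element) -> Bool {
    // Non-interactive elements (e.g. decorative views) pass automatically.
    guard element.isInteractive else { return true }
    guard let label = element.accessibilityLabel?
            .trimmingCharacters(in: .whitespacesAndNewlines),
          !label.isEmpty else { return false }
    return true
}

let okButton = Element(isInteractive: true, accessibilityLabel: "Submit")
let badButton = Element(isInteractive: true, accessibilityLabel: "  ")
print(hasValidAccessibilityLabel(okButton))   // true
print(hasValidAccessibilityLabel(badButton))  // false
```

In the real library, failing elements are flagged in the scanner’s overlay UI, and custom checks of this shape can be registered through its plugin framework.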

“Using the scanner does not eliminate the need for manual testing or automated tests — these are must-haves for delivering quality products. But GSCXScanner can speed up the development process by showing issues in app during development,” Janga wrote.

Its debut comes a day after Google released Lookout, an app for Pixel smartphones that uses the same underlying computer vision technology as Google Lens to help visually impaired users “see” by pointing their phone at objects, contributing to the Mountain View company’s growing library of accessibility apps. In January, it launched two — Live Transcribe and Sound Amplifier — that use machine learning algorithms to transcribe speech and amplify sounds. And last year, it debuted Voice Access, an app that replaces touchscreen tap interactions with voice equivalents.

More broadly, Google has made a concerted effort to improve the Android experience for deaf and hard-of-hearing users. In August 2018, it published a new open specification intended to kick-start the development of hearing aids that work flawlessly with Android phones over Bluetooth Low Energy (LE), replete with low latency and minimal impact on battery life. And in March 2016, Google launched a toolkit — Accessibility Scanner, the precursor and complement to Accessibility Scanner for iOS — that evaluates apps and suggests ways they might be improved for visually and hearing-impaired users.
