To commemorate the ninth annual Global Accessibility Awareness Day today, Google released Action Blocks, an Android app that uses Google Assistant to trigger actions with a single tap on a home screen shortcut. Alongside it, the company announced a slew of updates to AI-powered accessibility tools, including Live Transcribe and Sound Amplifier.
Research shows that disabilities remain a major impediment to tech usage. Disabled people are about three times as likely as those without a disability to refrain from using the internet, according to a 2016 Pew Research Center survey. They’re also roughly 20 percentage points less likely to subscribe to home broadband or own a computer, smartphone, or tablet. Google asserts that AI has a role to play in rectifying this, and it isn’t alone — companies like Microsoft and Amazon have also invested heavily in AI for accessibility.
Action Blocks — which works in English on Android 5.0 and above, though not yet with children’s Google Accounts — lets users link each block to a corresponding Google Assistant action. In fact, blocks can be configured to do anything Assistant can do, including (but not limited to) queuing up a show, controlling connected lights, ordering a rideshare, or calling a family member.
In this respect, Action Blocks are akin to Apple’s Siri Shortcuts, an iOS feature that lets users quickly perform preprogrammed tasks with a tap or by asking Siri. As Google explains: “Built with the growing number of people with age-related conditions and cognitive differences in mind, Action Blocks can also be used for people with learning differences, or even for adults who want a very simple way to access routine actions on their phones. Action Blocks may also be useful for anyone who could benefit from an easier way to perform routine actions on their device.”
This week, Google also updated Live Transcribe, which provides real-time speech-to-text transcriptions of conversations for people who are deaf or hard of hearing. Live Transcribe can now set a user’s phone to vibrate when someone nearby says their name; accept custom names or terms for different places and objects not commonly found in the dictionary; and search across three days’ worth of conversations stored locally on-device. Additionally, the app’s language support has expanded to Albanian, Burmese, Estonian, Macedonian, Mongolian, Punjabi, and Uzbek.
In related news, Google’s Sound Amplifier, a tool that clarifies the sound around users, now works with Bluetooth headphones and can boost the audio from media playing on Pixel devices. Chrome’s Get Image Descriptions feature, which uses AI to describe unlabeled images on the web, now understands French, German, Hindi, Italian, and Spanish. Voice Access on Android recognizes commands like “zoom in,” “magnify,” “pan left,” and “go right,” with a new grid view that lets people navigate their phones more easily. And in Google Maps and Google Search, it’s now easier to find accessibility information about over 15 million places in Australia, Japan, the UK, and the U.S.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.