
At its I/O 2018 developer conference, Google launched Lookout, an Android app that taps AI to help blind and visually impaired users navigate with auditory cues as they encounter objects, text, and people within range. By keeping their smartphone pointed forward with the rear camera unobscured, users can leverage Lookout to detect and identify items in a scene.

Lookout was previously only available in the U.S. in English, but today — to mark its global debut and newfound support for any device with 2GB of RAM running Android 6.0 or newer — Google is adding support for four more languages (French, Italian, German, and Spanish) and expanding compatibility from Pixel smartphones to additional devices. The company is also rolling out a new design to simplify the process of switching between different modes.

Tasks folks take for granted can be a challenge for the estimated 2.2 billion people around the world with visual impairments, who might not notice a maintenance flyer pinned to their building’s window or could struggle to pick out ingredients in an unfamiliar kitchen.



Lookout aims to lower the usability barrier through on-device computer vision algorithms and an audio stream. Google accessibility engineering product managers like Patrick Clary worked with low-vision testers to ensure Lookout can, for example, spot packages delivered to a storage room; a couch, table, and dishwasher in a condominium; and elevators and stairwells in high-rise buildings. The Lookout team also programmed in cues to indicate the location of objects in relation to users, like "chair 3 o'clock" to warn of an obstacle to the immediate right.
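The clock-face cues described above amount to mapping an object's horizontal bearing relative to the camera into one of twelve clock positions. The sketch below illustrates the general idea; the function names, angle convention, and rounding behavior are illustrative assumptions, not Lookout's actual implementation.

```python
def clock_position(angle_deg: float) -> str:
    """Map a horizontal bearing (0 = straight ahead, positive = to the
    user's right) to a clock-face direction string.

    This is a hypothetical sketch of how such a cue could be derived,
    not Lookout's actual code.
    """
    # Normalize to [0, 360) so that -90 (directly left) becomes 270.
    angle = angle_deg % 360
    # Each clock hour spans 30 degrees; snap to the nearest hour.
    hour = round(angle / 30) % 12
    return f"{12 if hour == 0 else hour} o'clock"


def announce(label: str, angle_deg: float) -> str:
    """Compose a spoken cue in the style the article quotes, e.g. 'chair 3 o'clock'."""
    return f"{label} {clock_position(angle_deg)}"
```

With this convention, an obstacle directly to the user's right (a 90-degree bearing) would be announced as "chair 3 o'clock", matching the example in the article.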

The redesigned Lookout relegates the mode selector, which was previously fullscreen, to the app's bottom row. Users can swipe between modes and optionally use a screen reader, such as Google's own TalkBack, to identify the option they've selected. One new mode (Food Label) reads product labels — in addition to barcodes — on items like cans of tomato soup. (According to Google, focus groups said labels are typically easier for them to find than barcodes on packaging.) Lookout also now gives auditory hints like "try rotating the product to the other side" when it can't spot a barcode or label right away.

"Quick read" is another enhanced Lookout mode. As its name implies, it reads snippets of text from things like envelopes and coupons aloud, even when the text is upside down. A document-reading mode (Scan Document) captures lengthier text and lets users read at their own pace, use a screen-reading app, or manually copy and paste the text into a third-party app.


Other quality-of-life improvements in Lookout include U.S. paper currency detection — something Google asserts is especially useful, given that paper currency lacks tactile features. Lookout can distinguish between denominations (e.g., “U.S. one-dollar bill”) and recognize bills from the front or back. As before, Lookout can identify objects within the camera’s viewfinder and verbalize what it believes them to be.

The new Lookout is available starting today in the Play Store. Google says a focus going forward is improved language support, but it isn’t ready to share any details.
