
Google AI today shared that it has created a model for detecting an endangered population of orcas in the Salish Sea, a waterway between the United States and Canada. Underwater microphones placed at a dozen points in the Salish Sea, which includes waters off Washington state and Vancouver Bay, alert officials when a Southern Resident killer whale is detected.

Fewer than 100 of these whales are thought to remain, according to the Center for Whale Research.

The orca detection model is the latest acoustic AI project from Google. It follows earlier work detecting the sound of chainsaws in rainforests to stop illegal logging, and a collaboration last year with the U.S. National Oceanic and Atmospheric Administration (NOAA) to help protect humpback whales.

The orca model runs on a platform operated by the nonprofit Rainforest Connection. Detection alerts are sent to the smartphone of Department of Fisheries and Oceans (DFO) officials in Canada, who can dispatch the Canadian Coast Guard to clear boat traffic in Vancouver Bay.

The DFO provided 1,800 hours of underwater audio recordings with 68,000 labels to train the model. Notifications also include an audio clip of the sounds detected by a deep neural network, so human experts can verify the prediction and draw their own conclusions about the whale's current state of health.
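Google has not published the model's internals here, but acoustic event detection of this kind typically slices audio into frames, converts each frame to a log-scaled spectrogram, and flags frames a classifier scores above a threshold. The sketch below is purely illustrative, not Google's pipeline: the frame sizes, frequency band, and threshold are invented for the example, and a trivial band-energy rule stands in for the trained neural network.

```python
import numpy as np

def frame_audio(signal, frame_len=256, hop=128):
    """Split a 1-D audio signal into overlapping frames."""
    n = 1 + max(0, (len(signal) - frame_len) // hop)
    return np.stack([signal[i * hop : i * hop + frame_len] for i in range(n)])

def log_spectrogram(frames):
    """Log-compressed magnitude spectrum per frame, a common model input."""
    windowed = frames * np.hanning(frames.shape[1])
    return np.log1p(np.abs(np.fft.rfft(windowed, axis=1)))

def detect_call(signal, band=(5, 20), threshold=3.0):
    """Toy detector: flag frames whose energy in a target frequency
    band exceeds a threshold (a trained classifier would go here)."""
    spec = log_spectrogram(frame_audio(signal))
    band_energy = spec[:, band[0]:band[1]].sum(axis=1)
    return band_energy > threshold

# Usage: a tone inside the monitored band triggers detections; silence does not.
t = np.arange(4096) / 8000.0                 # 8 kHz sample rate (hypothetical)
tone = np.sin(2 * np.pi * 400 * t)           # 400 Hz lands inside the band
silence = np.zeros(4096)
```

In a real deployment the per-frame decision would come from the trained network, and positive frames would be bundled with their audio clip and pushed as a notification for human review, as the article describes.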

Additional work is ongoing to better distinguish Southern Residents from other orca populations, and to understand when specific whale sounds are associated with health problems, Google AI engineer Matt Harvey told VentureBeat.

The company shared the news today at a Google AI event in its San Francisco office. In addition to real-time whale tracking, Google AI announced that it can now detect signs of anemia in people from retinal scans. This is the latest work from Google that uses computer vision to find patterns in eye scans, following earlier efforts to identify diabetic retinopathy and a range of other eye diseases.

Google AI also shared plans to bring transcription to Google Translate for long-form translation, akin to the speech-to-text transcriptions the Pixel 4's Recorder app now provides. No release date has been set, a company spokesperson told VentureBeat.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.