During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.
Google says users will soon be able to see how busy places are in Google Maps without searching for specific beaches, grocery stores, pharmacies, or other locations, an expansion of Google’s existing busyness metrics. The company also says it’s adding COVID-19 safety information to businesses’ profiles across Search and Maps, revealing whether they’re using safety precautions like temperature checks, plexiglass shields, and more.
An algorithmic improvement to “Did you mean?” — Google’s spell-checking feature for Search — will enable more accurate and precise spelling suggestions. Google says the new underlying language model contains 680 million parameters (the variables that determine each prediction) and runs in less than three milliseconds. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Google head of search Prabhakar Raghavan said in a blog post.
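Google has not published the architecture behind the new spelling model, but the basic candidate-generate-and-rank idea can be illustrated with a toy noisy-channel corrector. Everything in the sketch below (the tiny corpus, the frequency-based ranking) is a simplified stand-in for the 680-million-parameter neural model, not Google's implementation.

```python
# Hypothetical illustration only: generate edit-distance-1 candidates and rank
# them by corpus frequency. Google's production model is a large neural
# network; this toy version just shows the candidate-and-rank shape of the task.
from collections import Counter

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

# A tiny stand-in "corpus"; a real system would learn from web-scale text.
CORPUS = "home exercise equipment for small spaces and budget exercise gear".split()
WORD_COUNTS = Counter(CORPUS)

def edits1(word: str) -> set[str]:
    """All strings one edit (delete, transpose, replace, insert) away."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [left + right[1:] for left, right in splits if right]
    transposes = [left + right[1] + right[0] + right[2:] for left, right in splits if len(right) > 1]
    replaces = [left + c + right[1:] for left, right in splits if right for c in ALPHABET]
    inserts = [left + c + right for left, right in splits for c in ALPHABET]
    return set(deletes + transposes + replaces + inserts)

def correct(word: str) -> str:
    """Return the most frequent in-vocabulary candidate, or the input itself."""
    if word in WORD_COUNTS:
        return word
    candidates = [w for w in edits1(word) if w in WORD_COUNTS]
    return max(candidates, key=WORD_COUNTS.get, default=word)

print(correct("exercize"))  # -> "exercise"
```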
Google says it can now index individual passages from webpages, as opposed to whole pages. When this rolls out fully, Google claims it will improve roughly 7% of search queries across all languages. A complementary AI component will help Search capture the nuances of webpage content, ostensibly leading to a wider range of results for search queries.
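Google has not described how passages are scored, so the snippet below is only a rough sketch of the difference between ranking whole pages and ranking individual passages, with simple term overlap standing in for the actual neural ranking.

```python
# Illustrative sketch, not Google's ranking: split each page into fixed-size
# passages and score passages (rather than whole pages) against the query.
def split_passages(page: str, size: int = 30) -> list[str]:
    words = page.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def overlap_score(query: str, passage: str) -> float:
    q, p = set(query.lower().split()), set(passage.lower().split())
    return len(q & p) / len(q)

def best_passage(query: str, pages: dict[str, str]) -> tuple[str, str, float]:
    scored = [
        (url, passage, overlap_score(query, passage))
        for url, page in pages.items()
        for passage in split_passages(page)
    ]
    return max(scored, key=lambda item: item[2])

pages = {
    "diy-forum.example/thread-42": (
        "lots of unrelated posts about paint and drywall ... you can check "
        "whether your house windows have uv filtering glass by holding a "
        "cheap uv test card against the pane on a sunny day"
    ),
    "glass-guide.example/types": "a general overview of window glass types",
}
print(best_passage("uv filtering glass test", pages))
```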
“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”
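As a rough illustration of that kind of subtopic diversification (not Google's system), the sketch below clusters result titles with TF-IDF and k-means, then keeps one result per cluster so a broad query surfaces budget, premium, and small-space picks alike.

```python
# Assumed, simplified stand-in for "neural nets that understand subtopics":
# cluster result titles into subtopics and keep one result from each cluster.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

results = [
    "Best budget home exercise equipment under $100",
    "Cheap dumbbells and resistance bands for beginners",
    "Premium smart home gym machines reviewed",
    "High-end treadmills and rowers worth the price",
    "Compact workout gear for small apartments",
    "Fold-away exercise equipment ideas for tiny spaces",
]

vectors = TfidfVectorizer().fit_transform(results)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

diversified = {}
for title, label in zip(results, labels):
    diversified.setdefault(label, title)  # first (highest-ranked) hit per subtopic
print(list(diversified.values()))
```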
Google is also bringing Data Commons — its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities — to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.
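Developers can already query the same repository through the open-source datacommons Python client; in the minimal sketch below, the Chicago place identifier and the statistical-variable name are assumptions and should be checked against datacommons.org before use.

```python
# Minimal sketch using the open-source `datacommons` client (pip install datacommons).
# The DCID and statistical variable below are assumed for illustration.
import datacommons as dc

# dc.set_api_key("YOUR_API_KEY")  # some endpoints may require a key

CHICAGO = "geoId/1714000"  # assumed DCID for the city of Chicago

# Fetch the latest observation of the assumed unemployment-rate variable.
rate = dc.get_stat_value(CHICAGO, "UnemploymentRate_Person")
print(f"Unemployment rate in Chicago: {rate}%")
```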
On the ecommerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other auto brands, for example, smartphone users can zoom in to view the vehicle’s steering wheel and other details — to scale. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffled sleeves, and more.

Above: Augmented reality previews in Google Search.
In another addition to Search, Google says it will deploy a feature that highlights notable points in videos — for example, a screenshot comparing different products or a key step in a recipe. Google expects 10% of searches will use this technology by the end of 2020. And Live View in Maps, a tool that taps AR to provide turn-by-turn walking directions, will enable users to quickly see information about restaurants, including how busy they tend to be and their star ratings.
Lastly, Google says it will let users search for songs by simply humming or whistling melodies, initially in English on iOS and in more than 20 languages on Android. You will be able to launch the feature by opening the latest version of the Google app or Search widget, tapping the mic icon, and saying “What’s this song?” or selecting the “Search a song” button, then humming or whistling the tune for 10 to 15 seconds.
“After you’re finished humming, our machine learning algorithm helps identify potential song matches,” Google wrote in a blog post. “We’ll show you the most likely options based on the tune. Then you can select the best match and explore information on the song and artist, view any accompanying music videos, or listen to the song on your favorite music app, find the lyrics, read analysis, and even check out other recordings of the song when available.”
Google says melodies hummed in Search are transformed by machine learning algorithms into number-based sequences. The models are trained to identify songs based on a variety of sources, including humans singing, whistling, or humming, as well as studio recordings. The algorithms also abstract away all the other details, like accompanying instruments and the voice’s timbre and tone. What remains is a fingerprint Google compares with thousands of songs from around the world to identify potential matches in real time, much like the Pixel’s Now Playing feature.
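Google has not released that model, but the number-based-sequence idea can be sketched in a few lines: normalize a pitch sequence so key and loudness drop out, then compare contours with a simple distance. The catalog, pitch values, and matching metric below are illustrative assumptions, not the production algorithm.

```python
# Toy contour matching: a hummed melody in a different key still matches the
# reference tune once both are reduced to normalized pitch "fingerprints".
import numpy as np

def fingerprint(pitches: list[float]) -> np.ndarray:
    """Normalize a pitch sequence so only the melodic contour remains."""
    arr = np.array(pitches, dtype=float)
    return (arr - arr.mean()) / (arr.std() + 1e-9)

def match(hummed: list[float], catalog: dict[str, list[float]]) -> str:
    """Return the catalog entry whose contour is closest to the hummed one."""
    query = fingerprint(hummed)

    def distance(reference: list[float]) -> float:
        ref = fingerprint(reference)
        n = min(len(query), len(ref))  # crude length alignment
        return float(np.linalg.norm(query[:n] - ref[:n]))

    return min(catalog, key=lambda name: distance(catalog[name]))

catalog = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Happy Birthday": [60, 60, 62, 60, 65, 64],
}
hummed_off_key = [63, 63, 70, 70, 72, 72, 70]  # same contour, different key
print(match(hummed_off_key, catalog))  # -> "Twinkle Twinkle"
```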
“From new technologies to new opportunities, I’m really excited about the future of search and all of the ways that it can help us make sense of the world,” Raghavan said.
Last month, Google announced it will begin showing quick facts related to photos in Google Images, enabled by AI. Starting in the U.S. in English, users who search for images on mobile might see information from Google’s Knowledge Graph — Google’s database of billions of facts — including people, places, or things germane to specific pictures.
Google also recently revealed it is using AI and machine learning techniques to more quickly detect breaking news around natural disasters and other crises. In a related development, Google said it has launched an update using language models to improve matching between news stories and available fact-checking sources.
In 2019, Google shared its efforts to solve query ambiguities with a technique called Bidirectional Encoder Representations from Transformers (BERT). BERT, which emerged from the tech giant’s research on Transformers, forces models to consider the context of a word by looking at the words that come before and after it. According to Google, BERT helped Google Search better understand 10% of English queries in the U.S. — particularly longer, more conversational searches where prepositions like “for” and “to” inform the meaning.
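That bidirectional behavior is easy to see with the open-source BERT checkpoints available through Hugging Face's transformers library (distinct from Google's production Search stack). The query below is loosely modeled on one of Google's published examples, where the words after the blank determine which preposition fits.

```python
# Demo of bidirectional context with an open-source BERT checkpoint
# (pip install transformers torch); this is not Google's Search pipeline.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The words after the mask ("someone at the pharmacy") shape what belongs in
# the blank, which is exactly the before-and-after context BERT is trained on.
for prediction in unmasker("can you get medicine [MASK] someone at the pharmacy")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```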
BERT is now used in every English search, Google says, and it’s deployed across a range of languages, including Spanish, Portuguese, Hindi, Arabic, and German.