Artificial intelligence (AI) underlies many, if not most, of Google Search’s features. It’s how Google News personalizes podcast, video, and article recommendations, and how Google’s lightweight Search app for Android Go reads the text of websites in more than two dozen languages. And the Mountain View company is intent on expanding its influence.
At an event today commemorating Google’s 20th anniversary, the company took the wraps off three new AI enhancements bound for Search: auto-generated “immersive content,” video previews, and improved image search.
“When Search first began, our results were just plain text,” Cathy Edwards, director of engineering at Google Images, said in a blog post. “Today, we’re introducing … fundamental shifts in how we think about Search, including a range of new features that use AI to make your search experience more visual and enjoyable.”
The first of these concerns AMP Stories, Google’s open source library that enables publishers to build web-based, Snapchat-like flipbooks with slick graphics, animations, videos, and streaming audio. (Google debuted AMP Stories in February with eight publishers, including Vox Media, Condé Nast, and The Washington Post.) The company said it’s leveraging AI to “construct” AMP Stories and surface them in search results, starting with stories about celebrities, athletes, and other “notable people.”
On the video front, Google is leveraging computer vision to “deeply understand” the content of videos and highlight them in Search. Featured videos, as it calls them, will semantically link to subtopics of searches in addition to top-level content.
“For Zion National Park, you might see a video for each attraction, like Angels Landing or the Narrows,” Edwards wrote. “This provides a more holistic view of the video content available for a topic, and opens up new paths to discover more.”
On a somewhat related note, Google Images is getting a fresh coat of paint. The firm said it has retooled the Images algorithm to place greater emphasis on web page authority, the freshness of content, and image placement. (Sites where photos are “central to the page” and “higher up” on the page will get priority.) Also, starting this week, Google Images will show captions with web page titles and related search terms.
Last but not least, Google announced that Google Lens, its visual search tool, will migrate from Android and iOS to the web. Like the Google Lens app in Google Photos and the Google Search app for iOS, Lens in Google Images will analyze and detect objects — including but not limited to pets, landmarks, furniture, clothing, barcodes, and artwork — in snapshots and show relevant images. (It’ll let you dive deeper by drawing on parts of the pic that Lens didn’t preselect.)
“We hope these changes will make it easier — and more visually interesting — to traverse the web, find information, and pursue your interests,” Edwards said.
Google’s AI Search enhancements were unveiled alongside Activity Card, a new tab above mobile results that shows related and previous searches you’ve performed. Another new feature, Collections, lets you save searches around topics.
Today also saw the launch of Discover, a stream of contextual topic headings and cards on the Google homepage; AI-powered tools designed to assist in emergencies, including a flood prediction model (which will roll out first in India); and Pathways, a job search tool that recommends listings and training programs (it’s launching in Virginia).