
Ever spend entirely too much time trying to find a pivotal scene in a video clip you watched ages ago? We’ve all been there. While transcripts ease the burden somewhat, footage isn’t exactly skimmable like text. Until now.

Today, Google announced the rollout of a new feature — Key Moments — designed to surface shortcuts to video highlights. The next time you search Google for how-to videos, speeches, or documentaries, you’ll see links based on timestamps provided by content creators.
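Google’s announcement doesn’t show what creator-supplied timestamps look like, but on YouTube they are typically plain, labeled timestamp lines in the video description, which Search can parse into jump links. A hypothetical description for a how-to video:

```
0:00 Introduction
1:25 Gathering your tools
4:10 Step-by-step assembly
9:45 Final result
```

Each line pairs a `minutes:seconds` offset with a short segment label; the labels become the text of the Key Moments links.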

Google notes that for people who rely on screen-reading software to navigate the web, the new feature also makes video content more accessible. “Just like we’ve worked to make other types of information more easily accessible, we’re developing new ways to understand and organize video content in Search to make it more useful for you,” wrote Google Search product manager Prashant Baheti. “[Now,] you can find key moments within videos and get to the information you’re looking for faster, with help from content creators.”


Key Moments will initially appear in English for YouTube videos from contributors who have supplied the aforementioned timestamps. It’s currently limited to a small set of providers, but interested parties can sign up for early access. Google is also piloting an option for videos hosted elsewhere, which uses structured data to denote important video segments.
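The article doesn’t spell out the structured-data format, but schema.org defines a `Clip` type that can be nested inside a `VideoObject` via `hasPart`, with `startOffset` and `endOffset` given in seconds. A sketch of what such markup could look like, using hypothetical URLs, titles, and timings:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to tile a bathroom floor",
  "description": "A step-by-step tiling tutorial.",
  "thumbnailUrl": "https://example.com/thumbnail.jpg",
  "uploadDate": "2019-09-01",
  "contentUrl": "https://example.com/videos/tiling.mp4",
  "hasPart": [
    {
      "@type": "Clip",
      "name": "Preparing the surface",
      "startOffset": 30,
      "endOffset": 120,
      "url": "https://example.com/videos/tiling?t=30"
    },
    {
      "@type": "Clip",
      "name": "Laying the tiles",
      "startOffset": 120,
      "endOffset": 300,
      "url": "https://example.com/videos/tiling?t=120"
    }
  ]
}
</script>
```

Each `Clip` supplies a label for the segment and a `url` that deep-links to its start time, which is what a search result needs in order to render a clickable moment.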

“Soon, you’ll be able to find these key moments from video publishers around the world, such as CBS Sports and NDTV, as they add markup to their videos, and we look forward to more creators adopting this helpful new feature,” added Baheti.

Key Moments comes weeks after Search gained a nifty movie and TV show curation carousel. Earlier this summer, Google Images on mobile gained an upward swipe gesture that instantly opens webpages, and its desktop counterpart got a fresh coat of paint intended to make visualizing collections of apparel, flora, home goods, and hairstyles easier than before. In September, Images began showing captions with web page titles and related search terms, and the search algorithms underpinning it were retooled to prioritize webpage authority, fresh content, and image placement. Separately, Google brought Google Lens, its visual search tool, to Images on the web.

Within Search, Google earlier this year rolled out a series of enhancements to AMP Stories, an open source library that enables publishers to build web-based, Snapchat-like flipbooks with slick graphics, animations, videos, and streaming audio. This summer saw the launch of a “dedicated placement” in Search for Stories in specific categories, like travel, along with components that let creators embed interactive content. (Categories like gaming, fashion, recipes, movies, and TV shows will come later in the year.) In a somewhat related development, late last year Google said it would begin using AI to generate Stories about celebrities, athletes, and other “notable people” and surface them in search results.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact.