Google today released the second-generation Pixel Buds, earbuds with a dedicated chip for on-device machine learning that do away with the cord tethering the first-generation earbuds together. The device is designed to let in ambient environmental sound and uses adaptive sound to adjust volume based on your environment, so the volume may go up on the subway and down when you’re walking home, for example.
The new Pixel Buds go on sale in spring 2020 for $179.
Pixel Buds debuted today alongside the new Pixelbook Go and the much-anticipated Pixel 4, Google’s flagship smartphone, at a Made by Google hardware event held in New York City.
Unlike other devices introduced at the event, little was known about Pixel Buds before they were announced onstage at The Shed, a performing arts center in the city.
The latest version of Google’s Pixel Buds will offer hands-free access to Google Assistant, so you just have to say “Hey Google” to start a podcast, send a text message, or translate speech.
The device also has a long-range Bluetooth connection to keep you connected even when your phone isn’t by your side, so you can leave it in a locker at the gym or set it down at work or home. The connection works across three rooms indoors and nearly 100 yards outdoors. The new Pixel Buds get 5 hours of continuous listening time on a single charge and up to 24 hours with the wireless charging case.
The first-generation Pixel Buds debuted just two years ago, and the reboot is a welcome one, as the initial pair was not particularly popular. While Google demonstrated the Babel Fish-style translation tech we’ve all dreamed of onstage two years ago, our tests found the translator capable of handling snippets of speech no longer than 10 seconds.
Earbuds connected with Google Assistant on a smartphone can do things like record reminders and deliver navigation instructions. That’s the basic functionality afforded by any pair of headphones plugged into an Android smartphone, but headphones with Google Assistant integration can also do things like read your SMS messages or Facebook Messenger notifications and let you dictate answers back.
For people who need or want to wear headphones on a near-constant basis, hands-free “Hey Google” control can deliver efficiency gains and lower a barrier to AI assistant adoption.