Google is rolling out a handful of new AI-powered features for G Suite users around the world, spanning Docs, Google Assistant, Calendar, and more.
The announcements were made as part of the annual Google Cloud Next conference in London, which has already heralded some interesting news tidbits this week, including a bunch of new security smarts for Google Cloud Platform.
What’s up Docs
Yesterday, Google quietly announced that its popular Smart Compose feature will soon expand beyond Gmail to Google Docs. The news ties into two other features coming to Google Docs for G Suite subscribers.
Way back in July of last year, Google teased a new feature that would bring inline grammar and spelling suggestions to Google Docs, an upgrade that started rolling out to G Suite users in February. A similar feature was added to the business edition of Gmail recently. Now Google is turbocharging grammar suggestions in Docs by using neural network smarts, similar to the way Google builds its machine translation tools. In effect, this method involves feeding the tool millions of text-based examples that teach it common patterns for sentence construction.
Additionally, Google announced that Docs will soon offer spelling autocorrect powered by Google Search. This means suggestions will be informed by the latest vernacular, as it feeds on data from billions of searches from real people.
For businesses, however, the power of the crowd won’t always suffice because companies and industries often develop their own lingo, such as project-specific acronyms, unique team names and job titles, technical jargon, and so on. That’s why Google Docs will soon suggest spelling corrections that include words from a specific business domain. Along with more tailored recommendations, documents should soon contain fewer underlined words that have been flagged for human inspection.
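The idea of a frequency-ranked base vocabulary augmented with company-specific terms can be sketched roughly as follows. Everything here — the frequency table, the domain terms, and the helper names — is an illustrative assumption, not Google's actual implementation:

```python
# Illustrative sketch of a domain-aware spelling corrector: a base
# vocabulary ranked by usage frequency (standing in for search data),
# augmented with an organization's own lingo. All data here is made up.

BASE_FREQ = {"meeting": 900, "meetings": 400, "melting": 50}
DOMAIN_TERMS = {"okrs", "gsuite", "projx"}  # hypothetical company terms

def edits1(word):
    """All strings one edit (delete/transpose/replace/insert) away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [l + r[1:] for l, r in splits if r]
    transposes = [l + r[1] + r[0] + r[2:] for l, r in splits if len(r) > 1]
    replaces = [l + c + r[1:] for l, r in splits if r for c in letters]
    inserts = [l + c + r for l, r in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def suggest(word):
    """Return the word if known, else the most frequent one-edit fix."""
    w = word.lower()
    if w in BASE_FREQ or w in DOMAIN_TERMS:
        return w  # known word: no underline, no suggestion
    candidates = [c for c in edits1(w) if c in BASE_FREQ or c in DOMAIN_TERMS]
    if not candidates:
        return w  # nothing close enough; leave it flagged for a human
    return max(candidates, key=lambda c: BASE_FREQ.get(c, 0))

print(suggest("meetng"))  # corrected via the frequency-ranked base vocabulary
print(suggest("projx"))   # domain term: accepted as-is, no false flag
```

Because the domain set is checked alongside the base vocabulary, project names and acronyms stop being flagged as typos — which is the "fewer underlined words" effect described above.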
Google Assistant for G Suite
Google Assistant has come a long way since its debut more than three years ago, having made its way into all manner of applications and devices. But the voice-activated virtual helper took some time to reach the business realm, finally kicking off its arrival with a G Suite integration for Google Calendar last April.
Up until now, you could ask Google Assistant natural language questions such as “Where’s tomorrow’s meeting taking place?” or “When’s my next appointment?”, but it hadn’t lived up to its full potential. Moving forward, users will be able to do much more with Google Assistant and Calendar as part of G Suite: they can create, cancel, and reschedule meetings, and even block out their calendar to avoid having meetings scheduled at unsuitable times.
Beyond that, G Suite users will now be able to dial into meetings and email all the attendees to tell them that they’re running late, all via Google Assistant. Bear in mind that this supports contextual natural language queries, meaning you can simply say “Hey Google, join my next meeting” and Google Assistant will dial you into the call automatically from your phone. It’s worth noting that this won’t yet work with Hangouts Meet or any other VoIP app — just your phone.
All of these new features are available on an invite-only basis as part of the beta phase.
Companies that have invested $2,000 in one of the Hangouts Meet hardware kits developed by Asus will be able to access more intricate Google Assistant features as part of a meeting room setup. This will include joining group video calls with an “Okay Google, join the meeting” command, or even leaving a meeting or making a phone call.
It’s also worth noting that companies may not be comfortable using some of the voice-activated functionality during the beta phase, particularly if their calls contain highly sensitive content. Tech companies such as Google and Apple have hit the news recently after it emerged that humans were manually reviewing audio data gathered by voice assistants to improve their respective systems. Both Google and Apple eventually had to halt human voice-data reviews, and Amazon soon after introduced a feature that allowed users to disable human review altogether in Alexa.
In the beta signup page for the Hangouts Meet/Google Assistant integration, Google confirms that it needs to collect some data to enable and improve this functionality:
In order to use the feature and help develop the Test Product, we need to collect data, including transcripts of voice commands and data collected from employees’ usage of Google Assistant enabled Meet hardware devices.
Transcripts of voice commands will largely focus on the short period around “hot words” (e.g. “Hey Google”) that Google listens for in order to perform an action; the company notes that it may gather transcripts for a few seconds after a hot word is detected. These may also be “reviewed manually” (i.e. read by a human) during the testing phase.
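That capture behavior — retaining only the transcript from a short window after a hot word — can be sketched like this. The window length, timings, and data shapes are assumptions for illustration, not Google's documented behavior:

```python
# Illustrative sketch: keep only transcript segments that fall within a
# short window after a detected hot word. Timings and the window length
# are illustrative assumptions.

HOT_WORDS = ("hey google", "okay google")
WINDOW_SECONDS = 5.0  # assumed capture window after the hot word

def capture_after_hotword(segments):
    """segments: list of (timestamp_seconds, text) tuples.
    Returns the text spoken within WINDOW_SECONDS of any hot word,
    hot word included; everything else is discarded."""
    kept = []
    window_end = None
    for ts, text in segments:
        lowered = text.lower()
        if any(h in lowered for h in HOT_WORDS):
            window_end = ts + WINDOW_SECONDS  # (re)open the capture window
            kept.append(text)
        elif window_end is not None and ts <= window_end:
            kept.append(text)
    return kept

stream = [
    (0.0, "so about the quarterly numbers"),
    (4.2, "Hey Google"),
    (5.0, "join my next meeting"),
    (30.0, "anyway, back to the numbers"),
]
print(capture_after_hotword(stream))
```

The point of the sketch: speech before the hot word and speech well after the window never enters the transcript, which is what limits what a human reviewer could later read.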
It is worth stressing that the T&Cs are specific to the testing period, and it’s not clear how they will change once the beta phase is over.