At Transform 2021, Sarah Pearson, AI/ML for the DoD, spoke with Jonathan Rosenberg, CTO and head of AI at Five9; Christian Kitchell, AI solutions executive and head of Erica at Bank of America; and Barak Turovsky, head of product for NLP at Google AI, about recent advances in conversational technology and how NLP is increasingly being used in the real world.
While conversational AI technologies enable us to have human-like interactions with machines, technologists are only recently beginning to make those human-like interactions more human.
“Machines are still pretty dumb,” said Turovsky, “and it takes a lot of techniques to basically teach machines a lot of things that humans take for granted.”
That’s because much of the way humans talk can be very open-ended and ambiguous. If you say, “I’m learning Go,” are you referring to learning a programming language or a game? If you say, “They were out,” did you mean that the neighbors weren’t at home or the store was out of what you went there to buy?
Context, Turovsky explained, is the most important factor in discerning meaning. Simply put, machines have limited context, and in many cases they analyze only one sentence at a time.
To address this, Google has developed BERT (Bidirectional Encoder Representations from Transformers), a machine-learning technique that trains the language processing model by “hiding” about 20% of the input words.
“We train the model to guess the missing words,” explained Turovsky. “It’s kind of a game. When you train BERT in this way, it can actually predict the missing words and text, but it’s actually not that useful on its own, just by predicting it. However, it turns out that when you train a model in this way, on a very large set of data, billions and billions of words of text, the model learns an enormous amount of language understanding in general and world knowledge.”
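The masked-word training game Turovsky describes can be sketched in a few lines. This is a minimal, hypothetical illustration of the data-preparation step, not Google's implementation; the `MASK` token, the `mask_tokens` helper, and the 20% mask rate (taken from the talk) are assumptions for the sketch.

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_rate=0.2, rng=None):
    """Hide roughly `mask_rate` of the tokens, returning the corrupted
    sequence plus the positions and original words the model must guess."""
    rng = rng or random.Random()
    masked = list(tokens)
    labels = {}  # position -> original token the model should predict
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked[i] = MASK
            labels[i] = tok
    return masked, labels

# A training pair: the model sees `masked` and is scored on recovering `labels`.
sentence = "the model learns language by guessing the hidden words".split()
masked, labels = mask_tokens(sentence, mask_rate=0.2, rng=random.Random(42))
```

Repeated over billions of such sentences, guessing the hidden words forces the model to absorb grammar, word meaning, and world knowledge as a side effect.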
Turning to real-world applications, conversational AI came into the spotlight when COVID took hold and the lockdown came into effect, Kitchell said. Consumers at home were grappling with unprecedented financial challenges. They were contacting call centers, like Bank of America’s, with questions and requests for help tackling income issues and applying for government aid.
Kitchell and his team trained Erica, the Bank of America virtual financial assistant, to understand over 60,000 COVID-related terms and questions, and to field the first wave of customer queries.
“We had an unbelievable amount of volume coming into our call center,” Kitchell said, “so by being able to handle those more straightforward tasks with Erica, we freed up a lot of the time for our specialists to focus on more complex client needs.”
They were able to help about 600,000 consumer and small business clients apply for Paycheck Protection Program loans; defer payments on mortgages, credit cards, and other financial obligations; and stay up to date on the status of government payments, when to expect them, and how to get that cash into their accounts most efficiently.
“I think it was a really good example of how the nimble and flexible nature of conversational AI can be mobilized very quickly and brought to bear to help clients when they need you most,” he said.
One of Five9’s clients, a COVID clinic, had a similar urgent need for customer service backup, said Rosenberg. It had set up a contact center to take questions from people who wanted to know about COVID testing, especially in rural areas. Their agents were answering calls about where to get tested, and how, and when appointments would be available.
“As you can imagine, this COVID clinic could not staff agents to handle the call volume fast enough,” Rosenberg said. “It was just impossible. So this was a pretty important use case for AI here.”
The clinic came to Five9, which set up a voice-based conversational AI that was able to answer queries, schedule callbacks and consultations, and offer information about the nearest COVID testing facilities. Unlike previous-generation technologies, their model was good at recognizing street names and cities, and fielding complex questions. Because of the urgency of the request, Five9 worked to deploy their platform in 10 days, Rosenberg said.
“As a result, the clinic was able to immediately handle huge volumes of calls and get people tested, which is really critical,” said Rosenberg. “Here’s an example where conversational technology might have actually saved some people’s lives by getting them a test set up, or quickly connecting them to someplace where they could get a test and prevent someone else getting infected.”
Conversational AI can also come into play in fraught situations, where callers are frazzled or frustrated and want to speak to a person because they have more than just a simple query. That’s where agent assist technologies come in: the bot listens in on the conversation between the caller and the agent, offering the agent information and guidance as the call unfolds.
For example, when a customer calls a COVID information number to ask where the nearest clinic is, the bot can immediately surface local testing centers for the agent, so the caller doesn’t have to wait on hold while the agent types in a query.
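That agent-assist loop can be sketched very simply. The knowledge base, topics, and `suggest` function below are hypothetical stand-ins for illustration, not any vendor's actual engine, which would use intent classification rather than keyword matching.

```python
# Hypothetical knowledge base mapping call topics to the info an agent needs.
KNOWLEDGE_BASE = {
    "testing": "Nearest testing centers: Main St Clinic, Oak Ave Pharmacy",
    "appointment": "Next available appointments: tomorrow 9am-12pm",
    "results": "Results are typically ready within 48 hours",
}

def suggest(utterance):
    """Scan a caller utterance and surface matching knowledge-base entries
    for the agent, so they never put the caller on hold to search."""
    text = utterance.lower()
    return [info for topic, info in KNOWLEDGE_BASE.items() if topic in text]

# As the call unfolds, each caller utterance is run through the assist bot.
hints = suggest("Where is the nearest testing site?")
```

The point of the design is that the bot augments rather than replaces the agent: it handles the lookup while the human handles the conversation.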
“AI makes those answers faster, quicker, more efficient, allowing the agents to focus on empathy and human interaction,” he said.