Every day, individuals call into customer contact centers and provide sensitive information, like credit card numbers, to agents by voice. Now, a conversational artificial intelligence (AI) solution using natural language understanding capabilities offers a way to remove that information from calls, while still passing data through for transactions.
This is important because handling any sort of personally identifiable information (PII) inevitably means complying with an array of security and privacy regulations that can vary by jurisdiction. There is also a non-trivial risk that sensitive information could be leaked or stolen. In fact, there are known incidents where credit card information provided by voice has been written down by malicious agents, leading to undesirable outcomes.
“There was an incident where an enterprise customer came to us with a real-life story saying, hey, look, this happened, somebody noted down the credit card numbers and those things were leaked in the open market,” Srini Bangalore, head of AI research at conversational AI vendor Interactions, told VentureBeat. “That led us to start thinking about the technology itself, and how to go about redacting personally identifiable information in real-time with low latency, without impacting user experience.”
To that end, Interactions developed a new technology, Trustera, which is generally available as of today. The goal is to use AI and machine learning (ML) techniques to identify PII in real time, redact it from the live voice call and still pass the information, in encrypted form, to the underlying digital systems that process the transaction.
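Interactions has not published implementation details, but the basic pattern of detecting and masking a card number in a transcript while retaining the raw value for the payment system can be illustrated in a rough sketch. The regex and Luhn checksum below are simple stand-ins for the NLU models the company actually uses; all names here are hypothetical.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum, used to distinguish real card numbers from random digit runs."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 13-19 digits, optionally separated by spaces or hyphens
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,18}\b")

def redact_transcript(text: str):
    """Return (redacted_text, captured_values): the masked text is what the
    agent sees; the raw value would be handed off to the payment system."""
    captured = []

    def _mask(match):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            captured.append(digits)
            return "[CARD REDACTED]"
        return match.group()  # not a valid card number; leave it alone

    return CARD_RE.sub(_mask, text), captured
```

For example, `redact_transcript("my card is 4111 1111 1111 1111, thanks")` masks the number for the agent while keeping the digits available for the transaction back end. A production system would operate on streaming audio rather than a finished transcript, which is where the low-latency constraint Bangalore mentions comes in.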
Taking a hybrid AI approach to conversational AI
Interactions is a company that designs conversational AI technology platforms for organizations.
Conversational AI technology is commonly associated with human interactions with bots, but that’s not the approach that Interactions has largely taken. Bangalore said that his company has taken what he called a hybrid AI approach.
With the hybrid AI model, humans are part of the process alongside conversational AI, helping to keep the user experience frictionless. The Trustera system, for example, is not bot-driven; it is intended to operate in environments where an individual calls into a customer support center and then speaks with a human.
Bangalore said that redacting PII in human-led conversations is more complicated than in purely bot- and digital-driven interactions in an interactive voice response (IVR) type system. He noted that in IVR or bot conversations, the system knows when PII is being transmitted because the exchange is part of the process and initiated by the system.
With human-led conversations, PII is not always requested or transferred at the same point in a conversation. The system also needs to understand what kind of PII is being sent, as well as which human is speaking.
How Trustera conversational AI works to secure PII
The AI technology that Interactions has developed for its conversational AI platforms has its roots in capabilities that come from AT&T Bell Labs.
In 2014, Interactions acquired speech analysis technologies from AT&T, where Bangalore had worked for 18 years. The speech recognition capabilities have steadily improved in the years since, with the integration of natural language understanding (NLU) functionality, which helps to enable the Trustera service.
Interactions has trained its model on call data to understand when different human speakers transfer PII. The model isn't static; it is constantly being updated.
“We have a self-supervised auto-ML approach, where we take the previous day’s calls and we have a notional confidence metric to say these are data elements that we can add back to the model,” Bangalore said. “So we update the model periodically that way as well.”