Last December, during its re:Invent 2019 conference in Las Vegas, Amazon unveiled Contact Lens, a virtual call center product for Amazon Connect that transcribes calls while simultaneously assessing them. After a monthslong preview, Contact Lens today became generally available in the US East (N. Virginia), US West (Oregon), EU (Frankfurt), EU (London), Asia Pacific (Singapore), Asia Pacific (Sydney), and Asia Pacific (Tokyo) Amazon Web Services (AWS) regions, with rollouts in additional regions to come later this year.

As customer representatives are increasingly ordered to work from home in Manila, the U.S., and elsewhere, companies including John Hancock, Capital One, Intuit, GE, Square, Fujitsu, and Dow Jones are turning to AI solutions like Contact Lens to bridge gaps in service. The solutions aren’t perfect — there’s always going to be a need for human teams, even where chatbots are deployed — but COVID-19 has accelerated the need for AI-powered contact center messaging.

Contact Lens, which Amazon says is based on the same technology that powers its own customer service centers, is a fully managed set of capabilities enabled by AI and machine learning. With it, companies can ostensibly understand the sentiment, trends, and compliance of customer conversations, discovering emerging themes while conducting full-text search on call transcripts. Supervisors can use Contact Lens to view agents’ performance with detailed analytics, and in late 2020, the service will optionally alert supervisors to issues during in-progress calls, giving them the opportunity to intervene when a customer might be having a poor experience.

Contact Lens leverages deep learning to make it easier for supervisors to search voice interactions based on call content and conversation characteristics like talk speed, long pauses, and customer and agent interruptions. By clicking on the search results, supervisors can view a contact detail page showing the call transcript, customer and agent sentiment, and a visual illustration of conversation characteristics. And thanks to natural language processing that highlights potentially problematic words and phrases in transcripts, supervisors can use Contact Lens to uncover the issues driving customer outreach.
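To make the conversation characteristics above concrete, here is a minimal Python sketch of how metrics like talk speed, long pauses, and interruptions could be derived from a timestamped transcript. The `Turn` schema and thresholds are assumptions invented for this illustration, not Contact Lens's actual data model.

```python
# Hypothetical illustration of conversation characteristics like those Contact
# Lens surfaces: talk speed, long pauses, and interruptions, computed from a
# timestamped transcript. The Turn schema here is invented for the example.
from dataclasses import dataclass

@dataclass
class Turn:
    speaker: str   # e.g. "AGENT" or "CUSTOMER"
    start: float   # seconds from call start
    end: float
    text: str

def talk_speed(turns, speaker):
    """Words per minute for one speaker across the call."""
    words = sum(len(t.text.split()) for t in turns if t.speaker == speaker)
    seconds = sum(t.end - t.start for t in turns if t.speaker == speaker)
    return 60.0 * words / seconds if seconds else 0.0

def long_pauses(turns, threshold=3.0):
    """Gaps between consecutive turns longer than `threshold` seconds."""
    ordered = sorted(turns, key=lambda t: t.start)
    return [(a.end, b.start) for a, b in zip(ordered, ordered[1:])
            if b.start - a.end > threshold]

def interruptions(turns):
    """Turns that begin before the previous speaker has finished."""
    ordered = sorted(turns, key=lambda t: t.start)
    return [b for a, b in zip(ordered, ordered[1:])
            if b.start < a.end and b.speaker != a.speaker]
```

A production system would compute these from speech-to-text output with word-level timestamps; the point here is only that each characteristic reduces to simple arithmetic over turn boundaries.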


Furthermore, Contact Lens lets supervisors monitor agents’ interactions for customer experience, regulatory compliance, and adherence to script guidelines by defining custom categories within Amazon Connect that organize contacts based on words or phrases. Contact Lens also includes AI capabilities to automatically detect and redact sensitive personally identifiable information (PII) like names, addresses, and Social Security numbers from call recordings and transcripts to help customers more easily protect their private data.
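The two capabilities in that paragraph — word/phrase-based categorization and PII redaction — can be sketched in a few lines of Python. This is not the Contact Lens API; the category names, phrases, and the SSN regex are assumptions for illustration only.

```python
# Illustrative sketch (not the Contact Lens API): tagging a transcript with
# custom word/phrase categories and redacting SSN-like patterns. The category
# rules and regex below are invented for this example.
import re

CATEGORIES = {
    "cancellation-risk": ["cancel my account", "switch providers"],
    "script-compliance": ["thank you for calling"],
}

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # matches e.g. 123-45-6789

def categorize(transcript):
    """Return the names of all categories whose phrases appear in the text."""
    text = transcript.lower()
    return [name for name, phrases in CATEGORIES.items()
            if any(p in text for p in phrases)]

def redact_pii(transcript):
    """Replace SSN-like substrings with a placeholder token."""
    return SSN_RE.sub("[PII]", transcript)
```

The real service detects a broader range of PII (names, addresses) using machine learning rather than fixed patterns, but the input/output shape — transcript in, tags and a redacted transcript out — is the same idea.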

In the coming months, alongside the alerting feature, Amazon says Contact Lens will introduce a dashboard showing the sentiment progression of live calls. This dashboard will continuously update as the interactions evolve and allow supervisors to look across live calls to spot opportunities to help customers.

Contact Lens provides metadata such as transcriptions, sentiment, and categorization tags in Amazon Simple Storage Service (Amazon S3) buckets in a well-defined schema, Amazon notes. If they so choose, businesses can export this information and use tools like Amazon QuickSight or Tableau to perform further analysis and combine it with data from other sources.
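Because the output lands in S3 as structured records, post-processing is straightforward. The sketch below assumes a plausible shape for those records — the `Transcript`, `ParticipantId`, and `Sentiment` field names are assumptions for illustration, not the documented schema.

```python
# Sketch of post-processing analysis output fetched from S3. The S3 call uses
# the standard boto3 client; the field names in sentiment_by_participant are
# assumptions for illustration, not the documented Contact Lens schema.
import json

def load_analysis(bucket, key):
    """Fetch one analysis object from S3 (requires AWS credentials)."""
    import boto3  # assumed available in an AWS environment
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return json.loads(body)

def sentiment_by_participant(analysis):
    """Tally sentiment labels per participant across the transcript turns."""
    counts = {}
    for turn in analysis["Transcript"]:
        per_speaker = counts.setdefault(turn["ParticipantId"], {})
        per_speaker[turn["Sentiment"]] = per_speaker.get(turn["Sentiment"], 0) + 1
    return counts
```

From there, the tallies could be written to a table that Amazon QuickSight or Tableau reads, which is the export path the article describes.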
