By 2020, we will all have an Invisible Friend. Whether we call it Siri, Alexa, OK Google, or a chatbot, we are entering a world where an intelligent assistant recognizes our “intent.” This could spawn a massive shift in consumer behavior, as AI-infused bots would mean far fewer Google searches by humans.
This invisible friend would learn from its mistakes, maintain context, and continue to expand into new areas of expertise through judicious use of Knowledge Management (see the landscape below).
Although 2020 is our destination, now is a time of heightened activity among the companies that provide the elements of Intelligent Assistance. That activity is reflected in the Intelligent Assistance landscape that accompanies this update.
A high-resolution version of this landscape is available here.
Put it in writing (or “text,” rather)
Roughly 90 percent of a person’s time on a smartphone is spent interfacing with text. The Messengers and Telegrams of the world have surpassed social networking and everything else we do online. This is a massive transformation, and more powerful than “OK Google,” at least at this stage.
Indeed, even though a good deal of Natural Language Processing smarts came out of speech-enabled interactive voice response (IVR) platforms, man’s search for meaning has fixated on text-based input. This becomes even more obvious when looking at the IA Landscape, where the Text I/O market features 38 companies, versus only 14 in Speech I/O.
The social networking and search giants, Facebook and Google, have conditioned hundreds of millions of people to communicate digitally using their (ahem) digits. Virtual chat providers like 7, Artificial Solutions, Creative Virtual, Next IT, and Nuance Communications (via Nina Web) have risen to the challenge via dialogue boxes on websites that feature virtual agents. And now messaging platforms like Slack, WeChat, Facebook Messenger, and even Twitter are poised to be fertile ground for more IAs.
On messaging platforms we call them “bots”
Efforts to make messaging apps support conversational commerce have begotten hordes of ‘bots. In this context, the term refers to resources or programs that monitor our conversations and chime in with suggestions (“I see you need a hotel in Chicago, shall I book it for you?”).
You’ll see them vie for turf in the Personal Assistant, Personal Advisor, and Employee Assistant territories of the IA landscape. Thanks to the availability of software development kits and “open” APIs, dozens of ‘bots are born each day, and Opus Research will not be tracking all of them. But we will pay special attention to those that have depth, staying power, and demonstrate true utility by providing individuals with highly personalized, private control over the devices and services they use every day.
Early candidates to monitor are those involved with scheduling, project management, travel, routine purchasing, and even the classic joke of the day. X.ai has already made a name for itself with its scheduling assistant, “Amy Ingram.” But there are many other ‘bots to deal with, like Meekan for calendaring, Howdy.ai for automating common tasks, and Conversica for sales support.
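The pattern these ‘bots share — watch the conversation, match an intent, propose an action — can be sketched in a few lines. The sketch below is a hypothetical illustration, not any vendor’s actual implementation: real assistants rely on trained natural-language models, while the keyword rules and the `suggest` helper here are invented for the example.

```python
import re
from typing import Optional

# Hypothetical intent rules: each pairs a regex over the chat text with a
# function that turns the match into a suggestion. A production bot would
# use a trained NLU model; this only shows the monitor-and-suggest shape.
INTENT_RULES = [
    (re.compile(r"\b(?:need|book|find)\b.*\bhotel\b(?:\s+in\s+(?P<city>\w+))?", re.I),
     lambda m: "I see you need a hotel"
               + (" in " + m.group("city") if m.group("city") else "")
               + ", shall I book it for you?"),
    (re.compile(r"\b(?:schedule|set up)\b.*\bmeeting\b", re.I),
     lambda m: "Want me to find a time that works for everyone?"),
]

def suggest(message: str) -> Optional[str]:
    """Return a suggestion if the message matches a known intent, else None."""
    for pattern, respond in INTENT_RULES:
        match = pattern.search(message)
        if match:
            return respond(match)
    return None

print(suggest("We need a hotel in Chicago next week"))
# → I see you need a hotel in Chicago, shall I book it for you?
```

The interesting design question for real bots is everything this sketch omits: maintaining context across turns, learning from corrections, and knowing when to stay silent.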
Building on the broad shoulders of knowledge management
Knowledge management systems stand out as the earliest examples of executive efforts to make sense out of all the unstructured data and metadata that businesses (and government agencies) compile. In support of programs to optimize customer journeys, to maximize First Call Resolution and Net Promoter Scores, or to minimize Customer Effort, executives look to systems that can make sense of CRM history, call recordings, call detail records (CDRs), and all manner of activity.
A new cast of characters in speech processing
As Amazon’s Alexa has made dramatically evident, the need to talk to connected devices on the Internet of Things makes voice more important than ever before. This may seem to contradict my first observation regarding Text I/O, but the truth is that you have to support both, and you have to do it in ways that feel easy (and even elegant) to the individuals commanding them.
The big news of the past quarter was Google “opening up its Speech APIs,” ostensibly so that a broad spectrum of developers can speech-enable their applications and services. Microsoft did much the same by formally changing the name of Project Oxford (“projects” are seldom “products”) to Microsoft Cognitive Services. All appear to be efforts to commoditize both speech recognition and natural language understanding. In this context, “commodity” means “ubiquity,” and that bodes well for the large, incumbent solutions providers like IBM, Nuance, Google, and Microsoft. They all stand to benefit by selling more products and services when people ask for solutions that understand the words they say (in addition to text).
Thanks to all of the conversational Intelligent Assistants out there, having an Invisible Friend will no longer be stigmatized. Instead, we’ve set the stage for virtual assistants to become the preferred mechanisms for taking control of our digital lives. We’ll see them evolve from Assistants to Advisors and beyond, to Advocates that understand what we’re trying to accomplish online or on messaging platforms. We will judge them by the range and depth of services they support and by their ability to recognize and serve our personal intent while protecting our personal privacy.
Dan Miller is founder and lead analyst at Opus Research, a market research firm focused on Intelligent Assistance. You can track his Intelligent Assistance Landscape on VBProfiles.com.