Ever since Apple’s Siri heralded the age of intelligent assistants (IAs) four years ago — followed by Microsoft, Google, Amazon, and Facebook — pundits have complained that intelligent assistant technology isn’t living up to its promise.
The truth is that innovation in this domain, as in all technological domains, follows a predictable cycle and goes far beyond the big incumbents to include niche players (such as Nokia’s Here, the intelligent mapping and navigation specialist that BMW, Audi, and Daimler bought for $3.1 billion several months ago). Tracking that innovation when hundreds of companies are involved is very challenging. But we’ve done our best to accomplish that with the VBProfiles Intelligent Assistance Landscape.
We’re continuing to watch the space and update our report as the landscape changes, but here are some of our key conclusions to date:
Intelligent assistance is maturing, albeit slowly
In 2015, the IA space introduced new features, functions, and integrations at an accelerated rate. As the general public becomes more comfortable communicating with machines using their own words, demand is growing. On the supply side, technology providers benefit from “the API economy.” Well-defined interfaces and integration points make it possible for today’s IAs to add new tasks and capabilities to their repertoire quickly and efficiently. For example, Alexa, the IA for Amazon’s Echo, has moved beyond reading e-books aloud and can now order pizza and summon Uber.
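The “API economy” pattern described above — an assistant gaining new tasks through well-defined interfaces rather than core rewrites — can be sketched roughly as follows. All names here are illustrative, not any vendor’s actual skill API:

```python
# Minimal sketch of a plug-in "skill" registry: each capability exposes a
# well-defined interface, so the assistant can add new tasks without
# changing its core. Names are hypothetical, not a real vendor API.
from typing import Callable, Dict


class Assistant:
    def __init__(self) -> None:
        self._skills: Dict[str, Callable[[str], str]] = {}

    def register_skill(self, intent: str, handler: Callable[[str], str]) -> None:
        """Adding a capability is just registering another handler."""
        self._skills[intent] = handler

    def handle(self, intent: str, utterance: str) -> str:
        handler = self._skills.get(intent)
        if handler is None:
            return "Sorry, I can't do that yet."
        return handler(utterance)


assistant = Assistant()
assistant.register_skill("order_pizza", lambda u: "Ordering a large pepperoni.")
assistant.register_skill("summon_ride", lambda u: "A car is on the way.")

print(assistant.handle("order_pizza", "get me a pizza"))  # prints "Ordering a large pepperoni."
print(assistant.handle("read_book", "read my e-book"))    # falls through to the default reply
```

The point of the sketch is the integration boundary: because every capability conforms to the same interface, third parties can extend the assistant’s repertoire quickly, which is how an Echo-style device goes from reading e-books to ordering pizza.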
Apple’s Siri, which is decidedly less open, still lets users post to Facebook, book tables at nearby restaurants, play podcasts, and make appointments. Google’s unnamed IA is able to provide turn-by-turn directions. Microsoft’s Cortana adds travel times for planning purposes. And now Facebook’s message-based M can recommend clothing stores and new fashions and share all this with your friends.
Yet, as the picture above shows, we are at the beginning of an evolutionary journey. Many of the companies building technologies in this space are still young, as their recent early-stage funding events show. Enterprise IAs that support digital commerce are a clear example of where we are on the maturity path. While they are capable of delighting customers by quickly understanding or predicting their intent, eight out of 10 implementations simply answer questions or help navigate visitors to the right page on a website. The next step will be to bring much more data to bear, in real time, in order to provide advice and complete tasks. Following that, we expect to see IAs that are truly conversational and transactional: understanding will be augmented by machine learning and the ingestion of “personal” information (such as payment preferences, favorite brands, emotional state, and past activity) that informs responses and helps complete tasks.
Innovation in IA
Here are the developments we saw in 2015:
- The proliferation of bots: New bots appeared almost weekly. One example is Evia, which can help individuals buy car insurance based on a picture of their license plate.
- Mass acceptance of natural language processing: We no longer bark commands or search terms; instead, we ask questions and speak in full sentences.
- Migration to messaging: WhatsApp, WeChat, Facebook Messenger, and their peers are adding bots and becoming e-commerce ecosystems.
- Emergence of emotion detection: Companies like Emobase and Heartbeat Technologies are sorting out how to recognize and respond to our emotional “tells.”
And here’s what we’ll see in 2016:
- Intelligent Authentication (IAuth) promotes trust: IAuth will make it possible for an IA to have strong confidence that the person it is conversing with is who he or she claims to be. It will also help IAs maintain context and be better able to take turns in a conversation.
- Growth of conversational commerce: Today’s IAs customarily answer questions and then move on. Future IAs will know that, because you asked about nearby restaurants, there’s a good chance you will want to book a table. And, as long as you are going to book a table, perhaps you’ll want to share the experience with your friends and family.
- Empathetic IAs: Speech analytics can already detect when an individual is angry and ready to change vendors, but a new generation of solution providers promotes emotion detection responsive to a broader array of feelings.
- Ubiquitous IA: Amazon’s Echo has open APIs that keep adding capabilities. Today most Echo devices sit in the kitchen, but many have moved to the family room to rule the TV. Consistent, conversational interfaces in cars and at public kiosks are next.
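The conversational-commerce prediction above — an assistant that anticipates the next step instead of answering and moving on — can be sketched as a simple follow-up suggester. The intent names and the mapping are hypothetical, invented for illustration and not drawn from any shipping product:

```python
# Toy follow-up suggester: after fulfilling one intent, the assistant
# proposes the likely next action rather than ending the conversation.
# The mapping below is illustrative, not any real assistant's model.
from typing import Optional

FOLLOW_UPS = {
    "find_restaurants": "book_table",        # asked about restaurants -> offer to book
    "book_table": "share_with_friends",      # booked a table -> offer to share the plan
    "buy_clothes": "share_with_friends",     # bought clothes -> offer to share the find
}


def next_suggestion(completed_intent: str) -> Optional[str]:
    """Return the follow-up action the assistant should offer, if any."""
    return FOLLOW_UPS.get(completed_intent)


# After listing nearby restaurants, the assistant offers to book a table...
assert next_suggestion("find_restaurants") == "book_table"
# ...and after booking, it offers to share the plan with friends.
assert next_suggestion("book_table") == "share_with_friends"
```

In a real system the static dictionary would be replaced by a learned model over the “personal” signals the article mentions — payment preferences, favorite brands, past activity — but the conversational shape is the same: fulfill, then anticipate.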
Achieving IA’s true potential will require more new entrants into the landscape, providing technologies that go beyond accurate speech recognition, natural language understanding, and machine learning. That push will be driven by firms that tackle the challenges of aggregating and ingesting data from a multiplicity of sources (knowledge management) and that apply pattern recognition and analytics to support real-time conversations between people and machines.
It is going to take roughly 10 years for this IA vision to be fully realized. By that time, every “thing” in the Internet of things will be able to understand what we say. We, ourselves, will be totally comfortable as we carry on conversations with devices and objects far beyond the smartphones, tablets, and PCs that we currently regard as “personal” devices. Add your car, your TV, and those cylindrical speakers that have taken their place in your kitchen or family room.
Of course, this IA nirvana may arrive well ahead of the 10-year forecast. Almost daily we witness instances of machines outperforming humans at our own games. Just three weeks ago, for instance, Google’s parent company Alphabet showed that its DeepMind unit’s AlphaGo could beat the European Go champion five games to zero, an accomplishment that had seemed unthinkable only days earlier. The technologies are here today; productized versions are on the horizon. There’s work to do and there are problems to solve. Overall, it falls to us humans to keep development and deployment on schedule.
Dan Miller is founder and lead analyst at Opus Research, a market research firm focused on Intelligent Assistance. You can track his Intelligent Assistance Landscape on VBProfiles.com.