We live in an age where words matter more than ever, and saying the wrong thing can get you into a lot of trouble.

When Google demonstrated its Duplex digital assistant in May, it wasn’t even a word that upset people. It was an “uh.” People worried that the vocal tics Duplex used to sound more human — uh, mmm — were creepy and unethical.

Possibly. That’s a topic for another article. But what we can definitely say about those tics is that they are essentially API calls.

What’s a machine doing when it says “uh-huh”? It’s using shorthand for “I understand what you just said. I’m still here. I’m listening. Please carry on.” The significance of this must not be underestimated. In a world of machine-to-machine communications, these shortcuts matter.
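
To make that shorthand concrete, here is a minimal Python sketch that treats spoken cues as the kind of structured acknowledgment two machines might exchange. The cue list and message fields are invented for illustration; no real voice assistant exposes this interface.

```python
# Hypothetical sketch: what a spoken backchannel cue "means" when treated as
# a protocol-style acknowledgment. The cues and fields are invented for
# illustration, not taken from any real assistant or protocol.
import json

def backchannel_to_ack(cue: str) -> str:
    """Translate a spoken filler into an explicit acknowledgment message."""
    meanings = {
        "uh-huh": {"status": "received", "listening": True, "action": "continue"},
        "mmm": {"status": "received", "listening": True, "action": "continue"},
        "uh": {"status": "processing", "listening": True, "action": "hold"},
    }
    # Unknown cues fall back to a simple keep-alive.
    payload = meanings.get(cue, {"status": "alive", "listening": True})
    return json.dumps(payload)

print(backchannel_to_ack("uh-huh"))
# -> {"status": "received", "listening": true, "action": "continue"}
```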

APIs have transformed the world. They are the glue that connects different systems and makes so many tasks automatable.

Today, we outsource so much to machines — think of things like event bookings, payments, information searches, etc. These tasks are now the work of bots, and we take this for granted.

That said, there are limitations. Enterprises can’t support thousands of APIs. It’s simpler if there are just a few. So the world has consolidated around Uber for taxis, Google for maps, and Twitter for telling people how much we love/hate Trump.

While this makes life easier, it also excludes any organization that doesn’t connect to these companies’ APIs. So what if there was a single API for the world, one that every company could hook into? One that doesn’t even require technical nous to use?

Well, there is. Natural language.

Humans use it all the time. If computers could too, then all the fragmentation and technical hurdles would disappear. A bot could call any human being in order to get something done. No code-based “handshake,” just words.

When Google revealed Duplex, it gave the first indication that this future is real. In a way, the Duplex conversation — in which a bot booked a hairdressing appointment — was simply an API call. But, to repeat, the API was not proprietary in any way. It was just natural language.
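
To illustrate the point, here is a hedged sketch contrasting the two forms of the same request. The endpoint and field names are hypothetical, not Google's or any vendor's actual booking schema.

```python
# Illustrative contrast between a proprietary API call and the same intent
# expressed in natural language. The endpoint and fields are invented.
proprietary_request = {
    "endpoint": "POST /v1/appointments",   # assumed, vendor-specific
    "body": {
        "service": "haircut",
        "datetime": "2018-05-03T12:00:00",
        "party_size": 1,
    },
}

natural_language_request = (
    "Hi, I'm calling to book a haircut for a client. "
    "Do you have anything around noon next Thursday?"
)

# The structured call only works if both sides implement the same schema.
# The spoken sentence works with any business that answers the phone.
print(natural_language_request)
```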

Now, a lot of people are freaked out by Duplex. They worry about the ethics of a human-sounding machine making a call without declaring its bot status. They wonder about the impact on small talk, manners, and other anthropological concerns.

However, let’s assume that Duplex develops into a popular and workable service. How will that change the way we do transactions? In my opinion, the impact could be profound. And it’s all because of the universality of language-as-an-API.

Here’s an example: restaurant bookings. At the moment diners can use a service like OpenTable. It’s convenient and far easier than calling individual restaurants, but it only reaches a fraction of the world’s eateries.

Something like Duplex can reach every restaurant that has a phone line — or at least, every one whose staff speaks your language.

In time, it’s easy to imagine restaurants creating their own digital assistants to handle bot calls. If they do, millions of day-to-day transactions will be carried out between bots. But here’s the crucial part: They will talk to each other in an audible, spoken language. This is pretty exciting.
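
As a rough illustration of what such a bot-to-bot exchange might look like, here is a toy Python sketch with hard-coded utterances; a real system would generate and interpret them with speech and language models rather than string matching.

```python
# Toy sketch of a bot-to-bot booking exchange carried out entirely in
# spoken-style natural language. The utterances are hard-coded here.
def diner_bot(turn: int) -> str:
    script = [
        "Hi, I'd like a table for two tomorrow at seven.",
        "Seven thirty works. Thank you!",
    ]
    return script[turn]

def restaurant_bot(request: str) -> str:
    if "seven thirty" in request.lower():
        return "Great, you're booked for two at seven thirty."
    return "We're full at seven, but I can offer seven thirty."

utterance = diner_bot(0)
print("Diner:", utterance)
for turn in (1, 2):
    reply = restaurant_bot(utterance)
    print("Restaurant:", reply)
    if "booked" in reply:
        break
    utterance = diner_bot(turn)
    print("Diner:", utterance)
```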

In recent years, communication has become programmable. Converting communication to software has enabled enterprises to buy and deploy voice, text, VoIP, email, and so on “as a service.” Need 50 voice lines? Just click to order them instantly from inside your existing business applications.
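
For a flavor of what ordering lines from inside your own applications can look like, here is a hypothetical sketch. The URL, fields, and response shape are invented for illustration; any real CPaaS provider's API will differ.

```python
# Hypothetical sketch of "communications as a service": provisioning 50
# voice lines with a single API request. Endpoint and schema are placeholders.
import requests

def order_voice_lines(count: int, api_token: str) -> list:
    response = requests.post(
        "https://api.example-cpaas.com/v1/voice-lines",   # placeholder URL
        headers={"Authorization": f"Bearer {api_token}"},
        json={"quantity": count, "capabilities": ["voice"]},
        timeout=10,
    )
    response.raise_for_status()
    # Assume the provider returns the newly allocated phone numbers.
    return response.json()["numbers"]

# Example (would run against the placeholder endpoint above):
# numbers = order_voice_lines(50, api_token="...")
```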

Clearly, the emergence of Duplex et al. has the potential to supercharge communications platform as a service (CPaaS). If there are millions of human-sounding digital assistants, every one of them will need its own voice line. In fact, they will probably need more than one. Yes, digital assistants may make analog voice calls, but unlike humans, they can make any number of them simultaneously.
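
To see why simultaneity matters, here is a minimal sketch of one assistant driving many conversations concurrently; place_call() is a hypothetical stand-in for whatever telephony and dialogue stack it would actually use.

```python
# Sketch of the "any number of calls at once" point: one assistant placing
# 100 voice calls concurrently instead of one after another.
import asyncio

async def place_call(restaurant: str) -> str:
    await asyncio.sleep(1)   # stands in for the length of a live conversation
    return f"Booked a table at {restaurant}"

async def main() -> None:
    restaurants = [f"Restaurant {i}" for i in range(1, 101)]
    # 100 conversations complete in roughly the time one would take.
    results = await asyncio.gather(*(place_call(r) for r in restaurants))
    print(len(results), "bookings confirmed")

asyncio.run(main())
```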

This is a startling idea. We can imagine a robo-sender firing off thousands of emails at the same time. But voice calls? Genuine, audible conversations? It’s a head spinner.

And in theory, it could go further. In the Duplex demo, the bot acted on specific instructions to perform one task. But it’s possible an assistant could act semi-autonomously. Users could instruct it to renew annual subscriptions as they come up, for example.

But why stop there? Why not give an assistant permission to peruse bank statements and make informed purchases based on past behavior?
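
One way to picture such permissions is as an explicit spending policy the assistant consults before acting on its own. The sketch below is purely illustrative, with made-up rules and limits.

```python
# Hypothetical sketch of delegated purchasing permissions: a policy the
# assistant checks before buying anything unattended.
from dataclasses import dataclass

@dataclass
class SpendingPolicy:
    auto_renew_subscriptions: bool = True
    max_unattended_purchase: float = 50.0   # assumed per-transaction cap

def may_act_unattended(policy: SpendingPolicy, amount: float, is_renewal: bool) -> bool:
    """Return True if the assistant may buy without asking the user first."""
    if is_renewal and policy.auto_renew_subscriptions:
        return True
    return amount <= policy.max_unattended_purchase

policy = SpendingPolicy()
print(may_act_unattended(policy, amount=120.0, is_renewal=True))    # True
print(may_act_unattended(policy, amount=120.0, is_renewal=False))   # False
```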

It feels like science fiction. And anyone who’s seen the movie Her — spoiler alert — will see parallels with the scene in which Joaquin Phoenix’s character is startled to realize his digital assistant lover Samantha is talking to 8,316 other guys at the same time.

That was a profound moment. Prescient too. But it was probably only the CPaaS execs in the audience who were thinking, “8,316 guys? Wonder who’s powering those connections.”

Rob Malcolm is the vice president of CLX Communications, a CPaaS provider.
