
Toyota released an ad that shows what the future might look like for self-driving cars. Let's get one thing out of the way first: there's no way anyone will ever dress like that. And the car looks a bit too much like it's from a sci-fi movie. However, I did pay attention to the voicebot and how it provides contextual awareness for the driver. It's a good peek into where these bots might be heading, even if the car itself is a little too "out there" for me.

1. Conversation that seems real

I was blown away by one thing in the ad, and it wasn’t when the car swerved suddenly to avoid some rocks. The bot talked like an everyday person. That’s coming soon. Years ago, bots would use a robotic voice, something like the voice that could read documents on a Mac. With Alexa and Google Home’s voicebot (called the Google Assistant), the speech patterns sound normal, but the conversation is a bit stilted at times. Alexa still sounds like a bot. In the ad, the voice sounds like a friend sitting next to you, which will help create a free-flowing dialogue.

2. The bot can see you

In one short segment, the driver leans forward too much — he's obviously supposed to be a doofus — and speaks to the bot. "Noah … you can lean back in your seat," the bot responds. That's exactly what is missing from many of the bots today, including Amazon Alexa. Soon, bots will not only see our faces but will be able to look around the room. They'll identify other people, notice what we're eating, recognize how much light is in the room, and respond in kind.


3. Contextual awareness in spades

In another segment of the full ad, the bot can tell the driver has "big hair" and makes a joke about finding a booth for his date big enough for his hairstyle. I don't see AI getting anywhere close to that level of understanding soon, but it will get there eventually. An AI could follow your Facebook posts and know you're going on a date. It could see that you looked up directions to a fast food place and suggest a more elegant dining establishment. It could sense, mostly by watching your head movements and facial expressions, that you're nervous and make a joke to calm you down.

4. An element of surprise

Friends don't always divulge every ounce of information. They say things like, "Let's go eat at my favorite pizza place," but don't mention the name. When you arrive, you find out you both like the same place. That deepens the friendship, because friendship isn't just an exchange of factual information. There's some spontaneity, some empathy. In the ad, the bot picks a place to eat but doesn't say what it is. I liked that because it shows the bot "knows" the driver. As the self-driving car navigates the terrain, you slowly realize where you're headed. Maybe that could turn into a nightmare, especially if you don't like the pizza place, but it's a cool concept.