“Alexa, how long will it take me to get to Chicago from here?”

This question is not profoundly complex. In many ways, it’s more like a search query, and it shouldn’t have anything to do with A.I. Yet there are several important data points involved, and the Alexa assistant that runs on my Amazon Echo speaker doesn’t know what to do. When I ask this question, the bot tells me to set up my traffic route for a daily commute.

With the Google Assistant bot on the new Google Allo messaging app, things are a little different: questions like this lead to much better results. Google has spent the last 18 years parsing search queries. The Assistant knows I live outside of Minneapolis, so it can easily calculate the distance and, from that, the travel time.

I am trained as a journalist, so I don’t have any formal training in A.I. That said, I’ve been testing hardware and software for the past 15 years, and I know a good bot when I see one: it is helpful, provides a seamless experience, thinks through things quickly, offers practical features, and can mimic human behavior. Use whatever definition of A.I. you want, but the truth is, the consumer understands A.I. to be a thinking machine. When the machine doesn’t think, it’s not A.I. As an example: When I tell Alexa to turn off the lights at night, that’s helpful. When I ask her to tell me the population of Austria, that’s a neat trick. However, when I ask her to name the smallest country by population, she balks. Why is that? The Google Assistant knows the answer (the Vatican) because it’s smarter: it has parsed queries like mine from millions of other users since 1998.

One of the key differences between the Assistant and Alexa is that Google understands context. If I ask about the Minnesota Vikings and then follow that query by asking about “the quarterback,” the Assistant knows I’m interested in hearing more about Sam Bradford. When Alexa tells me it is Teddy Bridgewater, she is technically correct but woefully out of touch: he is injured but still listed as the main quarterback. And when I follow up with “When was he born?” Alexa doesn’t respond at all.

Even more important, the Assistant can understand the context of a human-to-human conversation. While discussing the Vikings with a friend today on the Allo app, I tagged @Google and asked about the traffic around the game. Amazingly, this worked: the Assistant reported today’s congestion because it knew I was talking about the Vikings game, which takes place in Minneapolis. That contextualization tells the user there is some thought behind the machine, that the Assistant is not just a glorified search engine. It helps that the Allo app also understands me when I press the microphone button and talk, and that Google already knows where I live. Alexa has dozens and dozens of skills (I’ve tested everything from checking the front door lock in my house to ordering a pizza), but the bot is not as adept at free-form conversation, contextualizing, or handling search queries. It has a long way to go before it starts acting like an A.I. machine.

That said, I’m sure Amazon is continually working to improve this bot. I like that Alexa has tentacles in so many things (audiobooks, shopping, even narrative-based games), which drives home my point about what makes a machine intelligent. After a few weeks, you get used to talking to Alexa, and you start seeing the bot as a trusted accomplice.

Apple’s Siri, the Google Assistant, and Alexa are the three best examples of A.I. bots today, at least in terms of providing real assistance and building consumer trust. There are more powerful systems (IBM Watson comes to mind), but I’m starting to use the Assistant almost daily, and I already rely on Siri when I’m mobile and Alexa when I’m at my desk. They will only get smarter and more helpful with time. For now, I’m turning to the Assistant for the toughest questions. I’ll keep reporting on how that goes.
