While we’re not going to let the bots take over just yet, it’s clear that they will meet our needs, offer proactive advice, and serve us. The truth is, we won’t try to understand technology; technology will try to understand us.

That’s a lofty goal. The digital revolution was about humans becoming accustomed to using computers all day, connecting with each other over social media, and even more arcane activities like learning how to use Photoshop. In the Fourth Industrial Age, technology will slide further behind the curtain into a more assistive role, one that is less about shiny new gadgets and operating system updates. In fact, eventually, the gadget craze will subside. It will be OS Who Cares. We won’t think as much about the next iPhone or the latest Android tweaks; we’ll care about how well the interfaces, hardware, and connections can customize themselves to meet our needs and then step out of the way.

The best example of this is the modern car. I’ve mentioned in the past that technology has stalled out a bit in the automotive sector, based on my testing of around 400 to 500 cars in the past few years. The entire industry is waiting for the robot car to take over. But what will that really look like? As a foundational concept, it means the car will understand what we want. It’s more about personalization than sensors. The car will know we need to get to the mall before it closes. I’ll know the “self-driving car revolution” has started the first time my Toyota reminds me about that trip and flags a traffic problem, not when it figures out how to avoid other cars on the highway. I believe that’s the easy part. Sensors in the car identifying a road closure up ahead is one thing; bots knowing I like to shop at Cabela’s and whisking me there is something completely different. Safety is assumed. The next step after that, personalization, is harder.

For intelligent sensors to work, the car has to use algorithms that are based almost entirely on complex math — the car is going at a certain speed, the obstruction ahead is a construction sign, other cars are starting to slow down. However, an intelligent car that understands my needs and motivations, my schedule, and my priorities? That’s an incredible challenge.
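The contrast can be sketched in code: the sensor side is deterministic rule-following over measured quantities, while the personalization side has to infer intent from soft, shifting signals. Here is a minimal illustrative sketch; every threshold, signal, and preference field below is hypothetical, not a real automotive API.

```python
# Illustrative contrast between rule-based sensor logic and
# preference-driven personalization. All values are hypothetical.

def should_brake(speed_mph: float, obstacle_distance_ft: float,
                 traffic_slowing: bool) -> bool:
    """Sensor side: a deterministic rule over measured quantities."""
    # Rough stopping-distance heuristic: more room needed at higher speed.
    safe_distance_ft = speed_mph * 4.0
    return obstacle_distance_ft < safe_distance_ft or traffic_slowing

def suggest_errand(preferences: dict, hour: int):
    """Personalization side: inferring intent from soft signals."""
    store = preferences.get("favorite_store")
    closes_at = preferences.get("store_closes_at", 21)
    # Nudge the driver when the favorite store is about to close.
    if store and hour >= closes_at - 2:
        return f"{store} closes soon - leave now?"
    return None

print(should_brake(60, 120, False))   # 120 ft is inside the 240 ft margin
print(suggest_errand({"favorite_store": "Cabela's",
                      "store_closes_at": 21}, 19))
```

The first function is pure arithmetic over sensor readings; the second only works at all because someone, or something, has already learned what the driver cares about.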

It’s similar to how voicebots like Alexa are really only dispensing facts and information. They are not overly personal at this point. Alexa doesn’t know much about me at all — not even my shirt size, which has shrunk a little this summer as I’ve started a daily bike routine. Many of the so-called artificial intelligence functions on the Amazon Echo speaker are really just algorithms (similar to how the sensors work on a car) that can search a database.
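The “just algorithms that search a database” point can be made concrete: many voicebot answers amount to keyword matching against stored facts, with no model of the user at all. A toy sketch, with an invented fact table for illustration:

```python
# Toy illustration of a voicebot answer as plain retrieval:
# keyword lookup over a fact table, with no model of the user.
FACTS = {
    "capital of france": "Paris",
    "boiling point of water": "100 degrees Celsius at sea level",
}

def answer(query: str) -> str:
    # Normalize the query down to a lookup key.
    key = query.lower().strip("?! ")
    # No personalization: same answer for every user, every time.
    return FACTS.get(key, "Sorry, I don't know that one.")

print(answer("Capital of France?"))  # Paris
```

Nothing in that lookup knows who is asking, which is exactly the gap between dispensing facts and understanding a person.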

Yet the coming revolution is about an AI understanding the human brain: our preferences, our choices, our desires. That will require a Herculean effort. For one thing, my preferences change. Today I’m thinking about biking apparel; tomorrow I’m thinking about going to the beach. An AI will have to adapt, respond, adjust, and customize a thousand times per day. It will need to work like the human brain, constantly making micro-adjustments based on changing variables. A true AI is one that serves us and knows us; we no longer have to know or serve it. We speak and it hears us. We don’t need to learn its parameters; it will learn our parameters.
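In machine-learning terms, that constant micro-adjustment looks like online updating: each new signal nudges the stored preferences rather than replacing them. A hypothetical sketch using a simple exponential moving average (the topics, weights, and rate are all invented):

```python
# Hypothetical sketch: preferences as weights that each new signal
# nudges (an exponential moving average), rather than fixed rules.
def update_preferences(prefs: dict, signal: str, rate: float = 0.2) -> dict:
    """Decay every interest slightly, then boost the one just observed."""
    updated = {topic: weight * (1 - rate) for topic, weight in prefs.items()}
    updated[signal] = updated.get(signal, 0.0) + rate
    return updated

prefs = {"biking apparel": 0.8}
prefs = update_preferences(prefs, "beach trips")
# "biking apparel" decays toward zero; "beach trips" rises with each mention.
print(prefs)
```

The point of the sketch is the shape of the problem, not the formula: yesterday’s strongest interest fades on its own unless it keeps being reinforced.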

We’re not there yet, of course. Most of us are still tethered to a smartphone all day. By 2030 or so, bots will become adaptive assistants that learn about our behaviors and fit smoothly into our daily routine. We’ll stop being enamored by tech. Tech will be enamored by us.

Above: The Machine Intelligence Landscape. This article is part of our Artificial Intelligence series. You can download a high-resolution version of the landscape featuring 288 companies by clicking the image.