“John, did you remember it’s your anniversary?”
This message did not appear in my inbox, and Alexa didn’t say it aloud the other day. I do have reminders on Facebook, of course. But no AI is powering my life decisions yet. Someday, AI will become more proactive, assistive, and much smarter. In the long run, it will teach us to have more empathy — the great irony of the coming machine learning age.
You can picture how this might work. In 2033, you walk into a meeting and an AI connects to your synapses and scans the room, a la Google Glass without the hardware. Because science has advanced so far, the AI knows how you are feeling. You’re tense. The AI uses facial recognition to determine who is there and your history with each person. The guy in accounting is a jerk, and you hardly know the marketing team.
You sit down at the table and glance at a HUD that shows you bios for a couple of the marketing people. You see a note about the guy in accounting: he sent out an email about his sick Labrador the week before. “How is your dog doing?” you ask. Based on their bios, you realize the marketing folks are just starting their careers. You relax a little.
I like the idea of an AI becoming more aware of our lives — of the people and circumstances around us. It’s more than remembering an anniversary. We can use an AI to augment any activity — sales and marketing, product briefings, graphic design. An AI can help us understand more about the people on our team, including coworkers and clients. It could help us in our personal lives with family members and friends. It could help in formal situations.
Yes, it sounds a bit like an episode of Black Mirror. When the AI makes a mistake and tells you someone’s family member died but gives you the wrong name, it will lead to an awkward interaction. And that will happen. But I also see a major advantage in having an AI work a bit like a GPS. Today, there’s a lot less stress involved in driving in an unfamiliar place. (There’s also the problem of people not actually knowing how to read a map and relying too much on a GPS, but that’s another story.) An AI could help us see another person’s point of view — their background and experiences, their opinions. An AI could give us more empathy because it can provide more contextual information.
This also recalls the movie Her, in which the technology is personified as an understanding voice. I see the AI as learning more about our lives and surroundings and then interacting with the devices we use. The AI knows about our car and our driving habits, knows when we normally wake up. It will alert people when we’re late to a meeting and send us information that is helpful for social situations. We’ll use an AI through a text interface, in a car, and on our computers.
This AI won’t provide a constant stream of information; instead, it will offer the right amount — the amount it knows we need to reduce stress or understand people on a deeper level. “John likes coffee; you should offer to buy him one” is one example. “Jane’s daughter had a soccer game last night; ask how it went” is another. This kind of AI will help in ways beyond just providing information. It will act more like a subtext, helping us communicate better and augmenting our daily activities.
Someday, maybe two decades from now, we’ll remember when AI was used only for parsing information. We’ll wonder how we ever used AI without the human element.