
You walk into a Gap store, take a shirt off the rack, and look it over. Immediately, the TV screen overhead switches to a video of a model wearing that shirt.

Welcome to the first days of Minority Report.

By the end of next year, systems that know what you’re looking at, how long you’re looking at it, the direction you’re walking, and even whether or not you’re in a good mood will start entering the market.

That’s the word from Michael Tusch, CEO and cofounder of Apical, a UK-based maker of advanced image processing technology. He recently painted a picture — exclusively for VentureBeat — of the kinds of applications we can expect in the coming months from his company’s newest vision technology.


Twelve-year-old Apical is not just another startup with a big vision. Tusch said his company’s products, which model how the human eye works, are “in at least 50 percent of the smartphones on the planet.”

The new technology from Apical is called Spirit. And, in a way, it does appear to capture the spirit of the people it sees.

The people-based information that Spirit extracts, the essence of a scene, consists of the faces and bodies in the shot, which way they’re looking, and which way they’re walking.

It does this by creating, in real time, a software model of you moving in space. “It’s a kind of digital puppet,” he said. As the company notes on its website, Spirit “virtualizes raw image sensor data into a digital representation of all salient features.”

It’s not you, or your image. It’s data about you, how you look, and how you move in space.

How many people looked at a poster, how long each looked, and how they were standing. What path visitors in a brick-and-mortar store usually take. How long they looked at something before a certain number of them bought it.
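Apical hasn’t published Spirit’s data format, but the “digital puppet” idea amounts to a per-frame record of anonymous, people-based measurements. The sketch below is purely illustrative: the class name, its fields, and the dwell-time helper are all invented here, not Apical’s API.

```python
from dataclasses import dataclass

# Hypothetical per-frame observation a Spirit-style system might emit.
# Note it carries data ABOUT a person, never an image of them.
@dataclass
class PersonObservation:
    person_id: int            # anonymous track ID, not an identity
    frame_time_s: float       # timestamp within the video stream
    position_xy: tuple        # location on the store's floor plan
    gaze_direction_deg: float # which way the person is looking
    walk_direction_deg: float # which way the person is moving

def dwell_time(observations, person_id):
    """Total seconds a tracked person stayed in view, e.g. at a poster."""
    times = [o.frame_time_s for o in observations if o.person_id == person_id]
    return max(times) - min(times) if times else 0.0
```

Analytics like “how long each person looked” then become simple queries over these records, with no video retained at all.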

But it can also automatically detect how children in a child care center are behaving, whether the apes in a zoo pen are getting out of hand, or if your 90-year-old grandmother, living on her own, is getting around okay.

Tusch said that, while his company’s software isn’t yet designed to do so, Spirit’s data extracts could be used by other software to recognize faces or detect if someone appears to be in a bad mood. So you could use it to automatically catalog all of your phone videos by the people in them or to automatically identify almost everyone your smartphone sees.

And Spirit does this real-time extraction on video at resolutions as high as 4K. If a conventional computer performed this processing, Tusch said, its “performance would be measured in teraflops.”

Yes, it’s Minority Report-like, he said. But he emphasized that the extracted information is abstracted data, not images, and Apical’s implementation is anonymous.

One impact of Spirit — which can be embodied as a dedicated capture and processing chip or as part of a larger chip — is that it could help carry some of the crushing burden that video is placing on the Net.

Instead of shipping security camera video to systems that can process it, for instance, the processing could mostly be handled by the camera or sensor so that less-bandwidth-intensive data, not images, would be transmitted.
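Rough arithmetic makes the bandwidth argument concrete. The figures below are illustrative assumptions, not Apical’s numbers: a compressed 4K stream runs on the order of tens of megabits per second, while a handful of per-person observations per frame fits in a few hundred kilobits.

```python
# Back-of-the-envelope comparison: shipping metadata instead of video.
# All constants here are illustrative assumptions, not Apical's figures.

def video_bitrate_bps(width, height, fps, bits_per_pixel=0.1):
    """Rough compressed-video bitrate estimate (0.1 bits/pixel assumed)."""
    return width * height * fps * bits_per_pixel

def metadata_bitrate_bps(people_in_view, bytes_per_observation=64, fps=30):
    """Bitrate if only small per-person observations are transmitted."""
    return people_in_view * bytes_per_observation * 8 * fps

video = video_bitrate_bps(3840, 2160, 30)       # a 4K camera stream
meta = metadata_bitrate_bps(people_in_view=10)  # ten tracked people
# Under these assumptions, the metadata stream is a tiny fraction
# of the video stream's bandwidth.
```

Even with generous assumptions for the metadata, the gap is two orders of magnitude or more, which is the case for doing the extraction at the camera.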

The ‘Internet of Behavior’

Apical hasn’t yet formally launched its Spirit technology, but the company has prototyped applications in smartphones and Internet-of-things devices.

Apical “will provide the platform,” he said, “and right now we’re working with several [manufacturers] to implement this in silicon.” Tusch predicted that non-smartphone products using the technology, such as sensors, will begin entering the market by the end of next year, and smartphones with this capability will emerge sometime in 2016.

“It will live in my cell phone in the video camera,” Tusch told us, “or it can be enabled as a new kind of product, a ‘smart sensor.’ ”

Such a sensor, he said, would not bother with capturing the video part. It would only capture the useful information. No need to record the video stream of Mary, in other words; just record the information that a person, apparently a female, looked here and walked there.

Apical calls this the “Internet of Behavior.”

“This is Google Analytics for the real world,” he said.
