Is Apple bringing us closer to Minority Report or the Holodeck with Multi-touch data fusion?


The Holodeck is a holographic simulation room made popular by the show Star Trek: The Next Generation. It allowed humans to interact with virtual environments the way we interact with the real world: through our senses. While we’re nowhere near that level of computing yet, Apple has a patent that could be considered an early step.

The patent, dubbed “Multitouch Data Fusion,” takes the concept of multi-touch computing — that is, using multiple contact points to manipulate something on a screen — and fuses it with other means of input. For example, in the diagram below you can see that multi-touch is fused with voice commands to manipulate the object on screen. While moving or resizing an object with your hands, you could also say something like “change color” or “insert text” rather than having to navigate a cumbersome menu system or chain together multiple hand gestures.
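To make the idea concrete, here is a minimal sketch of that kind of fusion — pairing a spoken command with whatever touch gesture is active at the moment it is spoken. The event types and matching rule are hypothetical illustrations, not anything described in Apple's actual filing:

```python
from dataclasses import dataclass

# Hypothetical event types -- an illustration of the general idea,
# not the patent's actual design.

@dataclass
class TouchGesture:
    target: str       # on-screen object being manipulated
    start: float      # gesture start time, seconds
    end: float        # gesture end time, seconds

@dataclass
class VoiceCommand:
    text: str
    timestamp: float  # when the command was spoken, seconds

def fuse(gesture: TouchGesture, commands: list[VoiceCommand]) -> list[str]:
    """Apply any voice command spoken while the touch gesture was active."""
    actions = []
    for cmd in commands:
        if gesture.start <= cmd.timestamp <= gesture.end:
            actions.append(f"{cmd.text} -> {gesture.target}")
    return actions

# Resizing a photo from t=0.0s to t=2.5s while speaking two commands;
# only the one uttered during the gesture gets fused with it.
resize = TouchGesture(target="photo", start=0.0, end=2.5)
speech = [VoiceCommand("change color", 1.2), VoiceCommand("insert text", 3.0)]
print(fuse(resize, speech))
```

The time-window match stands in for whatever disambiguation logic a real system would use; the point is simply that the voice channel supplies the *what* while the hands supply the *where*.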

Biometrics is another interesting input. Data such as hand size, fingerprints, body temperature, heart rate, and pupil size could change how you interact with the machine. Something like this could adjust on-screen controls dynamically to suit your individual needs.

Another example of multiple inputs is gaze vector fusion. This is a fancy term for your computer using a camera (such as the iSight on a Mac) to determine your head position and/or where on the screen you are looking. This might be useful on big multi-touch displays so you don’t have to move your arms all over the place to perform an action.
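The geometry behind this is straightforward: once the camera has estimated an eye position and a gaze direction, finding the on-screen point is just intersecting that ray with the plane of the display. Here is a toy sketch under that assumption (screen at z = 0, coordinates in meters) — not the method from the patent:

```python
import numpy as np

# Toy illustration: intersect an estimated gaze ray with the screen
# plane (z = 0) to find the point the user is looking at. This is the
# textbook ray-plane intersection, not Apple's actual technique.

def gaze_to_screen(eye_pos, gaze_dir):
    """eye_pos, gaze_dir: 3-vectors in screen coordinates, screen at z = 0."""
    eye = np.asarray(eye_pos, dtype=float)
    d = np.asarray(gaze_dir, dtype=float)
    if abs(d[2]) < 1e-9:
        return None                  # gaze parallel to the screen
    t = -eye[2] / d[2]               # distance along the ray to z = 0
    if t < 0:
        return None                  # looking away from the screen
    hit = eye + t * d
    return hit[0], hit[1]            # (x, y) on the screen plane

# An eye 60 cm from the screen, looking slightly left and down:
print(gaze_to_screen([0.0, 0.0, 0.6], [-0.1, -0.05, -1.0]))
```

A real system would then fuse that (x, y) estimate with touch input — for instance, treating a tap anywhere as acting on the object under the user's gaze, so large displays don't require large arm movements.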

If you think you’ve seen something like this before, you probably saw the Steven Spielberg film Minority Report. In it, Tom Cruise uses his hands and eyes to manipulate data on a wide range of screens.

An extension of this would be facial expression recognition. Imagine if a computer could do something based simply on your mood (or faked expression). In the filing, an example is outlined in which a user is attempting to do one thing but is doing it wrong and getting frustrated, so the computer tries to figure out what the user actually means to do and adapts. Now we’re really starting to get into some advanced stuff.

Most of this is obviously quite a ways off. Even plain multi-touch technology is still in its infancy with devices like the iPhone and Microsoft’s Surface computer. The future looks exciting — and soon computers will be able to tell if you look excited.

[photos: CBS Home Entertainment and 20th Century Fox/DreamWorks]

