What if any object in the world, not just smartphones and tablets, could know when and how you were touching it? If a team working at Disney Research and Carnegie Mellon University continues to make progress, soon we may have smarter chairs, doorknobs, bathtubs, and even living things.
Using the researchers’ new technology, called Touché, we could sense what is touching an object (a human or a fork?), how it is being touched (pushing, pinching, grasping), and which body part is doing the touching (a hand, an elbow, one finger or several). That means a flat surface could recognize whether you are standing, sitting, or Tebowing on it.
Touché operates on the same general principle as the capacitive sensor in your touchscreen phone. The difference is that where most smartphones only capture one frequency, which they interpret as touching or not touching (plus position data), Touché senses complex configurations by sweeping over a wider range of frequencies. The technique is called swept frequency capacitive sensing (SFCS), and it requires processing a much larger amount of information than traditional capacitive sensors — something that has become easier with today’s faster and cheaper microprocessors.
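To make the sweep idea concrete, here is a minimal, self-contained Python sketch. This is not the researchers' actual code: the "hardware" is simulated as a simple RC circuit, and the frequency range, resistance, and per-touch capacitance values are invented for illustration.

```python
# A sketch of the swept frequency capacitive sensing (SFCS) idea:
# instead of probing at one frequency, sweep across many and keep the
# whole amplitude-response curve. The simulated "hardware" below models
# the touched object as a first-order RC circuit whose effective
# capacitance depends on how it is touched (an assumption, for
# illustration only).

import numpy as np

FREQS_HZ = np.logspace(3, 6.5, 200)  # 1 kHz .. ~3 MHz sweep (assumed range)
R_OHMS = 1e6                         # assumed resistance of the sense path

def simulated_response(freqs: np.ndarray, capacitance_f: float) -> np.ndarray:
    """Amplitude of an RC low-pass divider at each excitation frequency."""
    omega = 2 * np.pi * freqs
    return 1.0 / np.sqrt(1.0 + (omega * R_OHMS * capacitance_f) ** 2)

def capture_profile(capacitance_f: float) -> np.ndarray:
    """One 'frame': sweep every frequency and record the response."""
    return simulated_response(FREQS_HZ, capacitance_f)

pinch = capture_profile(80e-12)   # invented effective capacitance of a pinch
grasp = capture_profile(300e-12)  # invented effective capacitance of a grasp

# The two touch configurations produce differently shaped curves across
# the sweep -- the extra information a single-frequency sensor never sees.
print("pinch vs. grasp profile distance:", np.linalg.norm(pinch - grasp))
```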
The technology needs only a single electrode, which opens the door to almost any object that can conduct electricity. That means a human body can become a sensor and, even cooler, different parts of the body can be distinguished by their capacitive properties. Even water can be turned into a touch sensor.
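Recognizing which body part (or utensil) is touching then reduces to matching a freshly captured profile against stored examples. The nearest-neighbor matcher below is one simple way to do that, continuing the sketch above (it reuses `capture_profile` and `FREQS_HZ`); the gesture labels and capacitance values are hypothetical.

```python
# Classify a swept-frequency profile by finding the closest stored
# template. A real system could train any classifier on labeled
# profiles; nearest-neighbor keeps the sketch short.

import numpy as np

def nearest_gesture(profile: np.ndarray,
                    templates: dict[str, np.ndarray]) -> str:
    """Return the label whose stored template is closest to `profile`."""
    return min(templates,
               key=lambda label: np.linalg.norm(profile - templates[label]))

# Hypothetical training step: record one template per touch configuration.
templates = {
    "no touch": capture_profile(10e-12),
    "one finger": capture_profile(80e-12),
    "full grasp": capture_profile(300e-12),
}

# Classify a fresh, slightly noisy reading.
reading = capture_profile(85e-12) + np.random.normal(0, 1e-3, len(FREQS_HZ))
print(nearest_gesture(reading, templates))  # -> "one finger"
```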
One potential application is a desk surface that is entirely touch-sensitive, letting you manipulate objects on your screen with far greater fidelity than even today’s best tablets allow. Fingers may be imprecise compared with a mouse pointer or a stylus, but that’s largely because tablets are small. Give your hands the entire surface of your desk to work on, and the results might be beyond our current imagination.
The researchers came up with their own neat concept uses: a music player that’s controlled by touching your own hand (hit a pinky to pause, two fingers on the palm to play); doorknobs that lock, trigger lights, or display messages based on how you touch them; a sofa that turns on the TV when you sit down, then turns down the lights when you recline; and a bowl of cereal that frightens a child who uses the wrong utensil.
The team also sees Touché as a way to control and interact with ever-smaller computer interfaces, and even to eliminate traditional input devices like keyboards and mice. Tools that currently must be large enough to provide space for a usable interface could keep shrinking.
“Devices keep getting smaller and increasingly are embedded throughout the environment, which has made it necessary for us to find ways to control or interact with them,” team member Chris Harrison said in a statement.
The team will be presenting their research at CHI 2012, the Conference on Human Factors in Computing Systems, in Austin next week. They have already been recognized with a much sought-after Best Paper award.
Check out this quick demonstration of the technology in action:
Video: http://www.youtube.com/watch?v=E4tYpXVTjxA