Monday, October 17, 2011

CMU Researchers Turn Any Surface Into A Touchscreen

Soon you, too, will be able to talk to the hand. A new system created jointly by Microsoft Research and Carnegie Mellon’s Human-Computer Interaction Institute can display an interface on almost any surface, including notebooks, body parts, and tables. The UI is fully multitouch, and the “shoulder-worn” rig tracks the surface you’re working on in 3D space so the interface stays put wherever you move it. It pairs a pico projector with a depth camera similar to the Kinect.
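
If you’re wondering how a depth camera “locates the surface you’re working on,” the rough idea is geometric: fit a plane to the 3D points the sensor returns for that surface. The sketch below is not OmniTouch’s actual pipeline, just a minimal Python illustration using synthetic points and a least-squares fit.

# Minimal sketch (not OmniTouch's real pipeline): locate a flat working
# surface by fitting a plane to points from a depth camera. The data here
# is synthetic; a real system would first segment out the surface region.
import numpy as np

def fit_plane(points):
    """Least-squares plane through a 3D point cloud.

    Returns (centroid, unit normal); the plane is all x with
    normal . (x - centroid) == 0.
    """
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value of the
    # centered cloud is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Synthetic "depth camera" samples of a slightly tilted tabletop, in meters.
rng = np.random.default_rng(0)
xy = rng.uniform(-0.2, 0.2, size=(500, 2))
z = 0.6 + 0.1 * xy[:, 0] + 0.002 * rng.standard_normal(500)
surface_points = np.column_stack([xy, z])

centroid, normal = fit_plane(surface_points)
print("surface centroid (m):", np.round(centroid, 3))
print("surface normal:", np.round(normal, 3))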

The project is called OmniTouch, and it supports “clicking” with a finger on any surface as well as controls that track finger position while a hand hovers over a surface. Unlike the Microsoft Surface, it needs no special, bulky hardware – unless you consider a little parrot-like Kinect-style sensor perched on your shoulder bulky. The rig is obviously obtrusive, but it’s a proof of concept right now and could be shrunk in future versions.
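
The click-versus-hover trick also falls out of the depth data: once you know where the surface plane is, a fingertip within a few millimeters of it reads as a touch, while one a bit farther away reads as a hover. The thresholds below are made-up illustrative values, not numbers from the OmniTouch work, and the plane parameters are assumed to come from something like the fit shown above.

# Hedged sketch of click-vs-hover detection: with a depth camera you know
# both where the surface is and where the fingertip is, so "touch" can be
# defined as the fingertip sitting within a few millimeters of the plane.
# The thresholds are assumptions for illustration, not values from the
# OmniTouch research.
import numpy as np

TOUCH_MM = 8.0    # fingertip within 8 mm of the plane counts as a click
HOVER_MM = 60.0   # within 6 cm counts as hovering; farther away is ignored

def classify_fingertip(fingertip, plane_centroid, plane_normal):
    """Return 'touch', 'hover', or 'away' for a fingertip position in meters."""
    dist_mm = abs(np.dot(fingertip - plane_centroid, plane_normal)) * 1000.0
    if dist_mm <= TOUCH_MM:
        return "touch"
    if dist_mm <= HOVER_MM:
        return "hover"
    return "away"

# Example with a horizontal surface 0.6 m in front of the sensor.
centroid = np.array([0.0, 0.0, 0.6])
normal = np.array([0.0, 0.0, 1.0])
for tip in [np.array([0.05, 0.02, 0.605]),   # ~5 mm off the surface -> touch
            np.array([0.05, 0.02, 0.640]),   # ~4 cm off -> hover
            np.array([0.05, 0.02, 0.800])]:  # far away -> ignored
    print(tip, "->", classify_fingertip(tip, centroid, normal))
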
So far the researchers have tested drawing and “crosshair” interaction with the system, and it has worked well on arms, hands, notebooks, and tables. This is obviously a research project, so don’t expect shoulder-mounted Xboxes any time soon, but by gum if this isn’t the coolest thing I’ve seen today.
