“How about I take my digital world and paint the physical world with that digital information?”
Computing devices are getting smaller and smaller, so that today we can carry computers in our pockets that keep us continually connected to the digital world.
But there is no link between our digital devices and our interactions with the physical world: information – which is what we are interested in – is confined to two separate worlds. It lives either on paper or on a screen, and there is no way to reach it directly from both worlds at once.
However, Pranav Mistry, a PhD student in the Fluid Interfaces Group at MIT’s Media Lab, has invented a device that brings digital information out into the tangible world and lets us interact with it through natural hand gestures!
This new, fantastic device is called SixthSense, and it’s made up of a pocket projector, a mirror and a camera, connected to a mobile computing device in the user’s pocket.
The projector displays visual information, so that we can use any surface or wall as an interface – even our own hands!
The camera uses computer-vision techniques to recognise and track the user’s hand gestures and physical objects, helped by coloured markers (the “visual tracking fiducials”) worn on the tips of the user’s fingers.
The software interprets the movements and arrangements of these coloured markers as gestures, which act as interaction instructions for the projected application interfaces. And all of this also supports multi-touch and multi-user interaction!
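To get a feel for how marker positions can become gestures, here is a minimal sketch in Python. It is purely illustrative and not Mistry's actual software: the marker names, coordinates, and thresholds are all assumptions, and a real system would first extract these positions from the camera image by colour segmentation.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) pixel positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_gesture(markers, pinch_threshold=30):
    """Map tracked fingertip markers to a gesture label.

    markers: hypothetical dict of marker positions, e.g.
        {"L_thumb": (x, y), "L_index": (x, y),
         "R_thumb": (x, y), "R_index": (x, y)}
    """
    lt, li = markers["L_thumb"], markers["L_index"]
    rt, ri = markers["R_thumb"], markers["R_index"]

    # Two-hand "framing" gesture: thumb and index of each hand spread
    # apart, hands held apart -> outline a rectangle (take a photo).
    spread = all(distance(t, i) > pinch_threshold
                 for t, i in [(lt, li), (rt, ri)])
    hands_apart = distance(li, ri) > pinch_threshold
    if spread and hands_apart:
        return "frame"

    # Single-hand pinch: thumb and index close together -> select.
    if (distance(lt, li) <= pinch_threshold
            or distance(rt, ri) <= pinch_threshold):
        return "pinch"

    return "none"

# Example: both hands open and held apart -> "frame" (photo gesture).
markers = {"L_thumb": (100, 300), "L_index": (100, 100),
           "R_thumb": (400, 300), "R_index": (400, 100)}
print(classify_gesture(markers))  # prints "frame"
```

The real system faces much harder problems (perspective, occlusion, noisy tracking), but the core idea is the same: geometric relations between tracked markers are mapped to commands for the projected interface.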
Do you want to watch a real-time weather forecast on your newspaper?
Simply let the device do its work!
Do you want to take a photo? Simply take it with your hands!
Do you want live video news? Simply watch it!
Watch this video to learn more about it! (The video has multilingual subtitles.)