The system consists of commonly available components that are intrinsic to its functioning: a camera, a portable battery-powered projection system coupled with a mirror, and a cell phone. All components communicate with the cell phone, which acts as the communication and computation device, and the entire hardware apparatus is packaged as a pendant-shaped mobile wearable device. The camera recognises individuals, images, pictures, and hand gestures, while the projector projects information onto whatever surface is in front of the user. The mirror is significant because the projector dangles pointing downwards from the neck. In the demo video that showcased the prototype to the world, Mistry wears coloured caps on his fingertips so that the software can more easily distinguish between the fingers when driving different applications. The software analyses the video data captured by the camera and tracks the locations of the coloured markers using simple computer-vision techniques. Any number of hand gestures and movements can be used, as long as they can be reliably identified and differentiated by the system, preferably through unique and varied fiducials. This is possible because the 'Sixth Sense' device supports multi-touch and multi-user interaction.
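The core of the marker tracking described above is colour thresholding followed by centroid extraction. A minimal sketch of that idea, using plain NumPy on a synthetic frame (the real prototype uses OpenCV on live video; the colour ranges and function name here are illustrative assumptions):

```python
# Sketch of coloured-marker tracking: threshold an RGB frame for a
# colour range, then take the centroid of the matching pixels.
import numpy as np

def find_marker(frame, lo, hi):
    """Return the (row, col) centroid of pixels whose RGB values fall
    within [lo, hi] per channel, or None if no pixel matches."""
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic 100x100 frame with a red 10x10 "marker" patch.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (220, 30, 30)

red_lo, red_hi = (180, 0, 0), (255, 80, 80)
print(find_marker(frame, red_lo, red_hi))  # centroid (44.5, 64.5)
```

Running the same pass once per marker colour (red, yellow, green, blue) yields one fingertip position per frame, which the gesture engine can then track over time.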
The camera captures an object in view and tracks the user's hand gestures, sending the data to the smart phone. It acts as a digital eye, connecting the user to the world of digital information.
The projector projects visual information, enabling surfaces and physical objects to be used as interfaces. The projector itself contains a battery with about three hours of battery life. A tiny LED projector displays data sent from the smart phone on any surface in view: object, wall, or person. The prototype uses the Optoma PK101 pocket projector, which is suitable for mobile usage.
The mirror is significant because the projector dangles pointing downwards from the neck; the mirror redirects and focuses the projection onto the surface in front of the user.
A Web-enabled smart phone in the user's pocket processes the video data; other software searches the Web and interprets the hand gestures. The prototype uses a Nokia N95 smart phone running Symbian OS (S60 edition), which has multitasking capability. The built-in camera supports execution of both the gesture-tracking engine and the gesture-enabled applications.
Marking the tips of the user's fingers with red, yellow, green, and blue tape helps the webcam recognize gestures. The movements and arrangements of these markers are interpreted into gestures that act as interaction instructions for the projected application interfaces.
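One way marker positions become interaction instructions is by tracking the distance between two fingertip markers across frames: fingers moving apart or together reads as a pinch. A hedged sketch (the function names, dead-zone threshold, and gesture labels are assumptions for illustration, not the SixthSense implementation):

```python
# Sketch: interpret two fingertip marker positions as a pinch gesture
# by comparing the thumb-index distance across two frames.
import math

def pinch_delta(prev, curr):
    """Change in thumb-index distance between two frames.
    prev/curr: ((x_thumb, y_thumb), (x_index, y_index))."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(*curr) - dist(*prev)

def classify(delta, dead_zone=2.0):
    if delta > dead_zone:
        return "zoom-in"    # fingers moving apart
    if delta < -dead_zone:
        return "zoom-out"   # fingers moving together
    return "idle"           # small jitter is ignored

prev = ((10, 10), (40, 10))   # 30 px apart
curr = ((5, 10), (50, 10))    # 45 px apart
print(classify(pinch_delta(prev, curr)))  # zoom-in
```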
Applications are implemented using Java 2 Micro Edition (J2ME), a Java platform designed for embedded systems whose target devices range from industrial controls to mobile phones. The computer-vision library used for gesture tracking is written in Symbian C++. The software for the SixthSense prototype is developed on a Microsoft Windows platform using C#, WPF, and OpenCV, and works on the basis of computer vision: a small camera acts as an eye connecting the user to the world of digital information, while the processing happens on the mobile phone using computer-vision algorithms. The prototype comprises approximately 50,000 lines of code.
Kinds Of Gestures Recognized
MULTI-TOUCH GESTURES are like the ones we see on the iPhone, where touching the screen and pinching or dragging makes the map move. FREEHAND GESTURES are like forming a frame with the fingers to take a picture, or a gesture to start the projection on the wall. ICONIC GESTURES mean drawing an icon in the air: for example, drawing a star shows the weather details, and drawing a magnifying glass shows the map. The system is highly customizable: users can define their own gestures for the SixthSense device to understand, adapting it to their needs.
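The customizability described above can be sketched as a registry that maps recognized gesture names to actions, so a new gesture only needs a new entry. The gesture names and handlers below are hypothetical examples, not part of the SixthSense codebase:

```python
# Sketch of a customizable gesture-to-action registry.
handlers = {}

def on_gesture(name):
    """Decorator registering a handler for a named gesture."""
    def register(fn):
        handlers[name] = fn
        return fn
    return register

@on_gesture("frame")        # freehand: finger-frame -> take a photo
def take_photo():
    return "photo captured"

@on_gesture("star")         # iconic: draw a star -> show weather
def show_weather():
    return "projecting weather"

def dispatch(name):
    """Run the handler for a recognized gesture, if one is registered."""
    return handlers.get(name, lambda: "unknown gesture")()

print(dispatch("star"))   # projecting weather
```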
Applications

Make a call
The Sixth Sense prototype can project a keypad onto the user's hand; the user then dials on that virtual keypad to make a call.
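Dialing on a projected keypad reduces to a hit-test: map a fingertip position within the projected rectangle to a key. A minimal sketch for a 3x4 phone keypad (the layout dimensions and coordinates are assumptions for illustration):

```python
# Sketch: hit-test a fingertip position against a projected 3x4 keypad.
KEYS = ["1", "2", "3",
        "4", "5", "6",
        "7", "8", "9",
        "*", "0", "#"]

def key_at(x, y, width=90, height=120):
    """Return the key under point (x, y) in keypad coordinates,
    or None if the point falls outside the projected keypad."""
    if not (0 <= x < width and 0 <= y < height):
        return None
    col = int(x // (width / 3))    # 3 columns
    row = int(y // (height / 4))   # 4 rows
    return KEYS[row * 3 + col]

print(key_at(45, 15))    # "2": middle column, top row
print(key_at(75, 105))   # "#": right column, bottom row
```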
Call up a map
With the map application, the user can call up a map of choice and then use thumb and index fingers to navigate it.
Check the time

The user can draw a circle on the wrist to get a virtual watch that shows the correct time.
Multimedia reading experiences
Sixth Sense can enrich a user's multimedia experiences. It can be programmed to project related videos onto newspaper articles the user is reading.
Retrieving information from the Web when you're on the go can be a challenge. To make it easier, graduate student Pranav Mistry has developed SixthSense, a device that is worn like a pendant and superimposes digital information on the physical world. Unlike previous "augmented reality" systems, Mistry's consists of inexpensive, off-the-shelf hardware. Two cables connect an LED projector and webcam to a Web-enabled mobile phone, but the system can easily be made wireless, says Mistry.
Users control SixthSense with simple hand gestures; putting your fingers and thumbs together to create a picture frame tells the camera to snap a photo, while drawing an @ symbol in the air allows you to check your e-mail. It is also designed to automatically recognize objects and retrieve relevant information: hold up a book, for instance, and the device projects reader ratings from sites like Amazon.com onto its cover. With text-to-speech software and a Bluetooth headset, it can "whisper" the information to you instead.
Remarkably, Mistry developed SixthSense in less than five months, and it costs under $350 to build (not including the phone). Users must currently wear colored "markers" on their fingers so that the system can track their hand gestures, but he is designing algorithms that will enable the phone to recognize them directly. --Brittany Sauser
1. Camera: A webcam captures an object in view and tracks the user's hand gestures. It sends the data to the smart phone.
2. Colored Markers: Marking the user's fingers with red, yellow, green, and blue tape helps the webcam recognize gestures. Mistry is working on gesture-recognition algorithms that could eliminate the need for the markers.
3. Projector: A tiny LED projector displays data sent from the smart phone on any surface in view--object, wall, or person. Mistry hopes to start using laser projectors to increase the brightness.
4. Smart Phone: A Web-enabled smart phone in the user's pocket processes the video data, using vision algorithms to identify the object. Other software searches the Web and interprets the hand gestures.