Synthetic Synesthesia of the Sixth Sense™

6ense™ (SixthSense™)

SixthSense (abbreviated "6ense", as in SIXense) describes a variety of wearable technologies (neckworn, headworn, wristworn, etc.), including a device comprising a neckworn pendant that contains both a camera and a data projector capable of projecting onto 3D subject matter such as the wearer's own hands or other 3D objects in front of the wearer.

The projector + camera combination augments the physical world with computer-generated/mediated content and allows the wearer to interact with that content through a Natural User Interface (e.g. hand gestures or manipulation of real physical objects).
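
To make the interaction loop concrete, the following is a minimal sketch (not the original SixthSense code) of a camera + projector pipeline in Python with OpenCV: the camera watches the scene, a stand-in skin-color detector finds the hand, and the overlay that would be sent to the projector is drawn around it. The device index, HSV thresholds, and overlay content are all illustrative assumptions.

    # Minimal camera + projector loop (illustrative sketch, not the original code).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                        # wearable camera (assumed index 0)
    cv2.namedWindow("projector", cv2.WINDOW_NORMAL)  # stands in for the data-projector output

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Stand-in gesture detector: threshold skin-like pixels, keep the largest blob.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))   # assumed skin-tone range
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        overlay = np.zeros_like(frame)               # content the projector would display
        if contours:
            hand = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(hand)
            cv2.rectangle(overlay, (x, y), (x + w, y + h), (255, 255, 255), 2)
            cv2.putText(overlay, "menu", (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
        cv2.imshow("projector", overlay)
        if cv2.waitKey(1) == 27:                     # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()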

It was invented by Steve Mann and later refined (and popularized) by Pranav Mistry, both of whom were PhD students at the MIT Media Lab.



Steve Mann using SixthSense in his everyday life in 1998. The hand gesture is recognized (in some applications by the wearable computer, and in some applications by another person being collaborated with) without the use of any colored tape or markings, which were required for computer recognition on Mann's earlier apparatus (IEEE Computer, Vol. 30, No. 2, February 1997), as well as on Mistry et al.'s 2009 apparatus.

An earlier version of 6ense was headworn and included gesture sensing:

Neckworn 6ense:
In the 1990s, Mann developed a wearable computer system with a webcam and a projector having infinite depth-of-focus, so that it could project onto the wearer's hands (and gestures) or onto arbitrary 3D objects and surfaces in front of the wearer.

History

SixthSense originated as a variety of wearable technologies (headworn, neckworn, wristworn, etc.), including the neckworn projector+camera system developed by Media Lab student Steve Mann[1]. Mann originally referred to these wearable technologies as "Synthetic Synesthesia of the Sixth Sense"[3][4]. In the 1990s and early 2000s, Mann used this project as a teaching example and taught several hundred students how to build the neckworn SixthSense system as part of the undergraduate curriculum at the University of Toronto[5]. In the 1990s the early aremac ("camera" spelled backwards, Mann's term for the projection apparatus) produced vector graphics rather than raster graphics[2], but a raster-graphics version based on a miniature wearable micromirror projector was developed in 2001; it could project onto the wearer's hands, onto other objects, or onto the floor or ground in front of the wearer, so that it could work with hand gestures or foot gestures[3].

Applications

One application of SixthSense was Computer Supported Collaborative Living, e.g. telepresence and remote interaction with others. One example of this was Telepointer, in which a Natural User Interface was developed using a second projector and camera.
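
One way to picture Telepointer's remote-interaction plumbing is the sketch below, which is an assumption about the mechanism rather than a description of the actual system: the remote collaborator sends normalized pointer coordinates over UDP, and the wearer's side converts them to projector pixels for drawing a crosshair. The port number and packet format are invented for illustration.

    # Hypothetical telepointer plumbing: remote pointer -> wearer's projector.
    import socket
    import struct

    def send_pointer(x: float, y: float, host: str = "127.0.0.1", port: int = 9999) -> None:
        """Remote side: send a pointer position with x, y normalized to [0, 1]."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(struct.pack("!ff", x, y), (host, port))
        sock.close()

    def receive_pointers(width: int, height: int, port: int = 9999):
        """Wearer side: yield projector pixel coordinates as packets arrive."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", port))
        while True:
            data, _ = sock.recvfrom(8)
            x, y = struct.unpack("!ff", data)
            yield int(x * width), int(y * height)   # where to draw the crosshair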

Other applications included interaction with a wearable computer (in the absence of one or more remote parties). SixthSense implemented various applications such as computing the size of objects, or querying information about objects, using a reach-and-grab or "hug" gesture. For example, reaching with open arms towards a tree looks up the tree, attempting to recognize its bark or to read a BARKode identifier on the tree.
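
For the object-measurement case, one standard way to compute an object's size from a camera image is the pinhole-camera relation; the article does not record SixthSense's actual method, so the following is a hedged sketch:

    # Pinhole-camera size estimate: size ~= distance * pixel extent / focal length (px).
    def object_size_m(pixel_extent: float, distance_m: float, focal_length_px: float) -> float:
        return distance_m * pixel_extent / focal_length_px

    # Example: an object spanning 300 px, 0.5 m away, with an 800 px focal length.
    print(object_size_m(300, 0.5, 800.0))   # 0.1875 m, i.e. about 19 cm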

A paper-viewing application lets the user navigate content displayed on a sheet of paper using paper-bending gestures.

A drawing application lets the user draw on any surface by tracking the movements of the user's index fingertip. (This work was published by Mann in IEEE Computer, 1997, but at that time required colored tape on Mann's index finger.)
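
A minimal sketch of such a colored-tape fingertip tracker is shown below, assuming red tape; the HSV range, camera index, and drawing style are illustrative, not taken from the 1997 system.

    # Track a red-taped fingertip and leave an ink trail (illustrative sketch).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    canvas = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if canvas is None:
            canvas = np.zeros_like(frame)
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))      # assumed red-tape range
        pts = cv2.findNonZero(mask)
        if pts is not None:
            x, y = pts.mean(axis=0).ravel().astype(int)             # centroid = fingertip
            cv2.circle(canvas, (int(x), int(y)), 3, (0, 255, 0), -1)  # leave an ink dot
        cv2.imshow("drawing", cv2.add(frame, canvas))
        if cv2.waitKey(1) == 27:
            break
    cap.release()
    cv2.destroyAllWindows()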

By 1998, SixthSense also recognized finger gestures without the colored tape.

A zooming gesture works by boxing a scene with two cropping "L" shapes and pulling the hands apart or moving them closer together.
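
The arithmetic behind that gesture is just a distance ratio; a small sketch follows, with hand positions assumed to come from a tracker such as the ones sketched above:

    # Zoom factor = current hand separation / separation when the gesture began.
    import math

    def zoom_factor(left0, right0, left, right) -> float:
        return math.dist(left, right) / math.dist(left0, right0)

    # Hands start 300 px apart and are pulled to 400 px apart: zoom in ~1.33x.
    print(zoom_factor((100, 200), (400, 200), (50, 200), (450, 200)))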

A xylophone gesture allows the wearer to use any surface as a musical instrument by implementing an idioscopic Natural User Interface.
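
A hedged sketch of one way such a surface instrument could map taps to notes follows; the bar count, the scale, and the idea of dividing the surface into equal bars are assumptions, not the original design.

    # Map the horizontal position of a detected tap to a virtual xylophone bar.
    NOTES = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]

    def note_for_tap(x: int, surface_width: int) -> str:
        bar = min(x * len(NOTES) // surface_width, len(NOTES) - 1)
        return NOTES[bar]

    print(note_for_tap(310, 640))   # a tap near the middle -> "F4"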

The wearer can interact with any floor, wall, board, or other surface, and "flash back" photos he or she has captured.

SixthSense also lets the wearer draw in the air by moving a finger (or a light source or other object), as if making a long-exposure photograph.
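
The long-exposure effect can be reproduced by keeping the per-pixel maximum over all frames, so a moving light source leaves a persistent trail; a minimal sketch (camera index assumed):

    # "Draw in the air": accumulate the per-pixel maximum across frames.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)
    trail = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        trail = frame if trail is None else np.maximum(trail, frame)
        cv2.imshow("light painting", trail)
        if cv2.waitKey(1) == 27:
            break
    cap.release()
    cv2.destroyAllWindows()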

SixthSense augmediates real-world objects or body parts with the aremac, projecting indicia (text, graphics, symbols, etc.) onto, for example, the objects or the wearer's own hands.

SixthSense can recognize the pages of a book and play back live video instructions, details, or examples as if they were affixed to the paper as a foldable display.
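
The article does not say how the pages are recognized; one standard approach is feature matching, sketched below with ORB in OpenCV. The reference image name and the match threshold are assumptions.

    # Recognize a known book page by ORB feature matching (illustrative sketch).
    import cv2

    orb = cv2.ORB_create()
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    ref = cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)   # assumed reference page image
    _, ref_desc = orb.detectAndCompute(ref, None)

    def is_known_page(frame_gray, min_matches: int = 40) -> bool:
        """True if the grayscale camera frame matches the reference page."""
        _, desc = orb.detectAndCompute(frame_gray, None)
        if desc is None:
            return False
        return len(bf.match(ref_desc, desc)) >= min_matches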

References
