Alternative control methods are one thing we keep coming back to here on NRM, whether it's interacting with a videogame using EMG sensors placed all over the body or controlling on-screen actions through gestures sensed by a wrist-mounted device. The 'Magic Finger', a research project from Autodesk Research in collaboration with the University of Toronto and the University of Alberta, is just the latest to catch our eye.
Impossibly small, it's a device that in practice can turn any surface into a touch screen. Utilising a micro RGB camera and an optical sensor (the kind you see if you lift your mouse off the desk), the 'Magic Finger' can also 'read' the texture of the surface it is interacting with (to an accuracy of 98.9%, we're told), further opening up new possibilities by allowing contextual actions to be assigned to certain surfaces. In its video, the team explains that the device can also read 2D data matrix codes, and describes the “simple authoring environment” used to map actions to specific surfaces.
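The contextual-action idea boils down to a lookup: classify the texture under the fingertip, then fire whatever action has been bound to that surface. A minimal Python sketch of that dispatch logic, purely our illustration (the surface names, actions, and confidence threshold are hypothetical, not taken from the team's software):

```python
# Hypothetical sketch of Magic Finger's contextual actions: a classified
# surface texture is looked up in a table of bound actions. All names
# here are illustrative assumptions, not the actual research code.

SURFACE_ACTIONS = {
    "desk": "act_as_touchpad",
    "jeans": "silence_phone",
    "palm": "answer_call",
}

def dispatch(texture: str, confidence: float, threshold: float = 0.9) -> str:
    """Return the action bound to a classified texture, or ignore the
    touch when the classifier's confidence falls below the threshold."""
    if confidence < threshold:
        return "ignore"
    return SURFACE_ACTIONS.get(texture, "default_touch")

print(dispatch("jeans", 0.989))  # prints "silence_phone"
print(dispatch("desk", 0.42))    # prints "ignore" (low confidence)
```

The confidence gate is the interesting design point: with a reported 98.9% classification accuracy, a threshold lets the device fall back to a harmless default rather than trigger the wrong action on a misread surface.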
Though the research is still in its very early days, it's not difficult to see how such a device could be used were it ever to come to fruition. Those behind the research already have their eyes on the likes of Google's Project Glass – as they point out, that particular project is a neat wearable HUD system that so far lacks a novel input system – but other implementations would surely be just as exciting. We'll keep our eyes on this one.