While I didn’t like Minority Report as a movie per se, its visions of the future and its various human/computer interfaces still fascinate me.
Most of them will require technologies well beyond what is available today, but getting from here to there has to start somewhere. And quite a few companies are already working on bits and pieces of Minority Report’ish tech in their R&D labs.
Here’s the latest interesting bit that Microsoft Research has been working on: a “Wearable Electromyography-Based (EMG) Controller,” which can be worn or temporarily attached to the user’s body and provides a Human-Computer Interface (HCI) that lets the user control and interact with computing systems via the electrical signals generated by the movement of various body muscles. In other words, a device that provides a wearable muscle-computer interface.
Now, electromyography – measuring the electrical activity of muscles during contractions – is nothing new, and has been widely used in the medical field. There are two ways of measuring EMG signals. The first, and the most accurate, requires sticking needle electrodes into your muscles. The second uses external sensors placed on your skin. But because the electrical muscle signals then have to travel through body fat and skin, and there’s a lot of extraneous noise interfering with them – from other muscles, skin movement, the environment, etc. – surface EMG is a pretty complicated thing to do. You need to scrub the skin of dead cells, carefully place the electrodes in the exact locations for best signal strength, and so on.
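To make the signal-versus-noise problem concrete, here’s a toy Python sketch – not Microsoft’s actual pipeline, and with all numbers invented for illustration – of the most basic surface-EMG processing step: rectify the raw signal, smooth it into an amplitude envelope, and threshold the envelope to detect a muscle contraction.

```python
import numpy as np

def emg_envelope(signal, window=50):
    """Rectify a raw EMG trace and smooth it with a moving-average
    window to get an amplitude envelope (a common surface-EMG step)."""
    rectified = np.abs(signal)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

# Toy example: 1 s of simulated "EMG" at 1 kHz -- baseline noise with
# a burst of higher-amplitude activity (a contraction) in the middle.
rng = np.random.default_rng(0)
fs = 1000
emg = rng.normal(0, 0.05, fs)            # baseline noise only
emg[400:600] += rng.normal(0, 0.5, 200)  # simulated contraction burst

env = emg_envelope(emg)
active = env > 0.2  # crude activity threshold (made-up value)
print(active[:100].any(), active[450:550].any())  # baseline vs. burst
```

Real systems do far more (band-pass filtering, notch filtering for mains hum, per-user calibration), but this is the core idea: turn a noisy oscillating signal into a smooth envelope you can compare against a threshold.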
All of that makes EMG a no-go for human-computer interface purposes. Or at least it was a no-go, until Microsoft found a way to reliably register the electrical activity of muscles in a natural environment. Their trick: spread many more EMG sensors than are usually required, then use an “automated positional localization process” to identify and select the subset of sensor nodes that happen to be positioned well enough to pick up the electrical signals corresponding to particular user gestures and movements.
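The filing doesn’t spell out how that localization works, but a crude stand-in for the idea might score each oversampled channel by how much its signal during a gesture stands out from its resting baseline, and keep only the top scorers. A hypothetical sketch (the function names, channel counts, and data are all mine, not Microsoft’s):

```python
import numpy as np

def select_sensors(gesture, rest, top_k=4):
    """Rank sensor channels by gesture-to-rest RMS ratio and return the
    indices of the top_k channels -- a toy stand-in for the patent's
    'automated positional localization process'."""
    gesture_rms = np.sqrt((gesture ** 2).mean(axis=1))
    rest_rms = np.sqrt((rest ** 2).mean(axis=1))
    score = gesture_rms / rest_rms           # how much each channel "lights up"
    return np.argsort(score)[::-1][:top_k]   # best channels first

# Toy data: 16 channels x 1000 samples. Only channels 2, 7, and 11 are
# positioned over the relevant muscle; the rest record baseline noise.
rng = np.random.default_rng(1)
rest = rng.normal(0, 0.05, (16, 1000))
gesture = rng.normal(0, 0.05, (16, 1000))
for ch in (2, 7, 11):
    gesture[ch] += rng.normal(0, 0.5, 1000)  # real muscle signal

best = sorted(int(c) for c in select_sensors(gesture, rest, top_k=3))
print(best)  # -> [2, 7, 11]
```

The appeal of this approach is that sensor placement no longer has to be exact: put on the armband however you like, perform a calibration gesture, and the system figures out which of the many electrodes are actually useful.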
The Wearable Electromyography-Based Controller can be implemented in various forms, including wearable devices and articles of clothing. For example, it can be part of an armband, a wristwatch, eyeglasses (with sensors in the frame), a shirt, gloves, or any other article of clothing or gadget worn by the user. Just put on the gadget or clothing with the embedded sensors and controller, and you can control any compatible computing device with a flick of your finger or even an eyebrow. Furthermore, since the controller can detect muscle strain and provide immediate ergonomic feedback to the user, it could also be used in training for musical instruments or various sports.
Well, it’s all in Microsoft’s research labs for now, and it will most likely take years to reach the level of a marketable application. On the other hand, before Microsoft announced Kinect, such capabilities also seemed like far-off, Minority Report’ish stuff.
If you liked the post, you might find these interesting too:
- Remote input sleeve from Nokia
- 3D Game Controller for Sony PS3
- Apple’s own project Natal – Kinect like 3D controller
- Sony Ericsson’s Mobile Phone/Universal Remote Control
- Body Fat Measuring Phone from BenQ