Samsung’s latest patent filing has us looking at a possible future of mobile phone input technology. The idea isn’t entirely new (Sony Ericsson and Nokia have both offered similar concepts of their own), but this filing is the most concrete take we’ve seen so far. Given Samsung’s knack for being one of the fastest innovators lately, it won’t surprise us if they manage to pull this one off.
The patent filing itself is pretty straightforward. Its abstract states, in plain terms, that it is about:
“A handheld gesture recognition control apparatus and its method are provided for a mobile phone. The input method of the present invention includes collecting a plurality of images; storing the images as control images; mapping the control images to corresponding control commands; capturing an image taken by a camera as a current image; comparing the current image to the control images; selecting one of the control images as a target control image according to a comparison result; extracting a control command mapped to the target control image; and executing the control command.”
So, simply put, Samsung aims to create the world’s first cellphone hand-gesture recognition technology, which it plans to implement as a new way to input and control items on mobile phones.
This new technology will supposedly be based on hand gesture controls fed into the device via its camera, which captures photos of gestures as they are performed in view of the lens. The captured photos are then analyzed and compared against images preloaded on the phone, each of which is mapped to its own corresponding control command.
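The control flow the abstract describes (store control images, map them to commands, compare a captured frame against the stored set, execute the best match’s command) can be sketched roughly like this. The class, names, and toy similarity metric below are my own illustration, not anything from Samsung’s filing:

```python
def similarity(img_a, img_b):
    """Toy metric: fraction of matching pixels (images as flat tuples).
    A real system would use proper computer-vision feature matching."""
    matches = sum(1 for a, b in zip(img_a, img_b) if a == b)
    return matches / len(img_a)

class GestureController:
    def __init__(self):
        self.control_images = {}   # gesture name -> reference image
        self.commands = {}         # gesture name -> callable command

    def register(self, name, image, command):
        """Store a control image and map it to a control command."""
        self.control_images[name] = image
        self.commands[name] = command

    def handle_frame(self, frame, threshold=0.7):
        """Compare the captured frame to every stored control image and
        execute the command mapped to the closest match, if close enough."""
        best = max(self.control_images,
                   key=lambda n: similarity(frame, self.control_images[n]))
        if similarity(frame, self.control_images[best]) >= threshold:
            return self.commands[best]()
        return None

# Register a hypothetical "flip" gesture, then feed in a noisy frame.
ctrl = GestureController()
ctrl.register("flip", (1, 1, 0, 0), lambda: "flip current item")
print(ctrl.handle_frame((1, 1, 0, 1)))   # close enough to "flip"
```

The threshold matters: a frame that matches nothing stored on the phone should do nothing, rather than trigger the least-bad gesture.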
For example, let’s say you hold your hand open in front of your cameraphone with your palm facing the lens. If you then make a “flipping” gesture, turning your hand around so that its back faces the lens, you’ve effectively issued a command to flip whatever item is currently on-screen. I can see some very nice applications for this technology, although I doubt it would be easy to learn and adapt to all the gestures such a new input system would require.
I’d like to call this the next level of multi-touch, but this hand gesture control doesn’t even support multi-touch gestures in the first place. Plus, it will definitely not offer any haptic feedback, unless you can find a way to install haptics in the air around you.
If you’re a fan of playing air guitar, though, I’m sure you’ll love where this idea is going. I also hear they’re coming up with a Braille input system next.
If you liked the post, you might find these interesting too:
- Control your Sony Ericsson phone with hand gestures
- Remote input sleeve from Nokia
- Google’s Project Glass could be operated via laser projected virtual keypad/keyboard
- Apple is looking beyond Multi-Touch
- VR/3D Controller for your Sony PS3