
Techies Hangout

Tuesday, May 24, 2011

The Invisible iPhone

A new interface lets you keep your phone in your pocket and use apps or answer calls by tapping your hand.

Over time, using your smart-phone touch screen becomes second nature, to the point where you can even do some tasks without looking. Researchers in Germany are now working on a system that would let you perform such actions without even holding the phone—instead you'd tap your palm, and the movements would be interpreted by an "imaginary phone" system that would relay the request to your actual phone.
The concept relies on a depth-sensitive camera to pick up the tapping and sliding interactions on a palm, software to analyze the video, and a wireless radio to send the instructions back to the iPhone. Patrick Baudisch, professor of computer science at the Hasso Plattner Institute in Potsdam, Germany, says the imaginary phone prototype "serves as a shortcut that frees users from the necessity to retrieve the actual physical device."
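The researchers' code isn't published, but the three-stage pipeline they describe (depth camera, analysis software, wireless relay) can be sketched in rough form. Everything below, from the class name to the message format, is a hypothetical illustration of that structure, not the project's actual implementation:

    # Hypothetical sketch of the pipeline described above: a depth camera
    # watches the palm, tracking software finds the touch, and a wireless
    # radio relays it to the phone. The camera and tracker objects stand in
    # for hardware drivers and vision code not shown here.
    import json
    import socket

    class ImaginaryPhoneRelay:
        def __init__(self, depth_camera, finger_tracker, phone_host, phone_port=9000):
            self.camera = depth_camera      # Kinect-style depth sensor
            self.tracker = finger_tracker   # analyzes each depth frame
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            self.phone = (phone_host, phone_port)

        def run(self):
            while True:
                frame = self.camera.read_depth_frame()
                touch = self.tracker.locate_touch(frame)  # None when no finger is on the palm
                if touch is not None:
                    # Relay the palm coordinate and event type (tap or slide)
                    # over Wi-Fi; the phone interprets it against its own UI.
                    msg = json.dumps({"x": touch.x, "y": touch.y, "event": touch.kind})
                    self.sock.sendto(msg.encode(), self.phone)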
Baudisch and his team envision someone doing dishes when his smart phone rings. Instead of quickly drying his hands and fumbling to answer, he could simply slide a finger across his palm and take the call remotely.
The imaginary phone project, whose team includes Hasso Plattner Institute students Sean Gustafson and Christian Holz, is reminiscent of SixthSense, a gesture-based interface developed by Pattie Maes and Pranav Mistry of MIT, but it differs in two significant ways. First, there are no new gestures to learn: the imaginary phone concept simply transfers the iPhone screen onto a hand. Second, there's no feedback, unlike SixthSense, which uses a projector to provide an interface on any surface. The lack of visual feedback limits the imaginary phone, but it isn't intended to completely replace the device, just to make certain interactions more convenient.


Last year, Baudisch and Gustafson developed an interface in which a wearable camera captures gestures that a person makes in the air and translates them to drawings on a screen.
For the current project, the researchers used a depth camera similar to the one in Microsoft's Kinect for Xbox, but bulkier and positioned on a tripod. (Ultimately, a smaller, wearable depth camera could be used.) The camera "subtracts" the background and tracks the finger position on the palm. It works well in various lighting conditions, including direct sunlight. Software interprets finger positions and movements and correlates them to the positions of icons on a person's iPhone. A Wi-Fi radio transmits these movements to the phone.
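As a rough illustration of that correlation step, a fingertip position normalized to the palm could be snapped to the phone's home-screen icon grid. The 4x4 grid below is an assumption based on the classic iPhone layout; none of this code comes from the researchers:

    # Hypothetical mapping from a normalized palm coordinate (0..1 on each
    # axis, as the tracker might report it) to a home-screen icon cell.
    GRID_COLS, GRID_ROWS = 4, 4   # assumed iPhone-style icon grid

    def palm_point_to_icon(x_norm, y_norm):
        """Return the (column, row) of the icon under a palm touch."""
        col = min(int(x_norm * GRID_COLS), GRID_COLS - 1)
        row = min(int(y_norm * GRID_ROWS), GRID_ROWS - 1)
        return col, row

    # A tap near the top-left of the palm selects the top-left icon;
    # one near the bottom-right selects the last icon in the grid.
    assert palm_point_to_icon(0.10, 0.05) == (0, 0)
    assert palm_point_to_icon(0.90, 0.95) == (3, 3)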


In a study submitted to the User Interface Software and Technology conference in October, the researchers found that participants could accurately recall the positions of about two-thirds of their iPhone apps when pointing at a blank phone, and with similar accuracy when pointing at their palm. The positions of frequently used apps were recalled with up to 80 percent accuracy.
"It's a little bit like learning to touch type on a keyboard, but without any formal system or the benefit of the feel of the keys," says Daniel Vogel, postdoctoral fellow at the University of Waterloo. Vogel wasn't involved in the research. He notes that "it's possible that voice control could serve the same purpose, but the imaginary approach would work in noisy locations and is much more subtle than announcing, 'iPhone, open my e-mail.' "
