A touch-sensitive gadget with the sensing panel on its back, rather than on the screen, is being developed by US researchers. Using your fingers behind the device allows a firmer grip and more accurate control without obscuring your view of the screen, they say.
Multi-touch interface technology hit the commercial market this year, with the US release of Apple's iPhone in June. But the iPhone's touchscreen is not perfect, says Daniel Wigdor of Mitsubishi Electric Research Labs (MERL) and the University of Toronto, Canada.
"As soon as you put your hands on the display you [obstruct] the screen," he says, something he calls the "occlusion problem". Users of iPhones have other problems too, he adds. "Multi-touch devices detect the entirety of the touch area," Wigdor continues. "That's what we call the 'fat finger' problem."
The two problems combined make it difficult to select precise targets, such as the keys on a virtual keyboard.
Wigdor at MERL, Patrick Baudisch at Microsoft Research, and their co-authors have a novel solution to these problems. Their prototype, LucidTouch, is a device that can be held comfortably in two hands, similar to the PlayStation Portable.
It has a large touch-sensitive LCD screen, similar to that used in the iPhone. But it can also be controlled using a touch-sensitive interface on its rear surface, a solution to the occlusion problem.
When using the rear touch interface, the user's fingers appear as shadows on the screen, giving the illusion they are holding a transparent device. LucidTouch highlights the active point of each finger with a small green dot, addressing the "fat finger" problem. "We're trying to address the problem of occlusion by giving the user an idea of their input, while still being able to see through the hands to the screen," Wigdor says.
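The pseudo-transparency effect described above can be sketched in a few lines: a frame from a camera behind the device is mirrored left-to-right, blended faintly over the UI so fingers show as shadows, and a green marker is drawn at each reported touch point. The frame format, blend weight, and function names here are illustrative assumptions, not details of the LucidTouch prototype.

```python
# Sketch of pseudo-transparency rendering: rear-camera shadows plus
# green dots at the active fingertip points. All parameters are assumed.

SHADOW_ALPHA = 0.35  # how strongly finger shadows darken the UI (assumed)

def composite(ui, rear, touches):
    """ui, rear: 2D grids of (r, g, b) tuples, same size.
    touches: list of (x, y) active fingertip points reported by the panel."""
    h, w = len(ui), len(ui[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Mirror the rear camera left-right so finger shadows line up
            # with where the fingers actually are behind the screen.
            shadow = rear[y][w - 1 - x]
            brightness = shadow[0] / 255.0  # crude: red channel as intensity
            # Bright rear pixels (fingers) darken the UI; dark background
            # leaves it untouched.
            darkness = 1.0 - SHADOW_ALPHA * brightness
            r, g, b = ui[y][x]
            row.append((int(r * darkness), int(g * darkness), int(b * darkness)))
        out.append(row)
    # Overlay a green dot at each finger's active point.
    for x, y in touches:
        out[y][x] = (0, 255, 0)
    return out
```

In the real device the shadow layer would come from whatever rear sensor is used; the key idea is only that the shadow image is mirrored and kept translucent, so the UI underneath stays legible.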
"I like the idea," says Alistair Edwards at the University of York. "It clearly addresses the occlusion problem, and using dots 'attached' to the fingers also attempts to address the fat-finger problem."
But Edwards would like to see the technology pushed further. "If I had one of these, I would be wanting to try out all sorts of ideas. What about 'throwing' objects [from one hand to the other]?" he says. "Also, why not build in some orientation detectors, so that you can also manipulate objects by tilting the device?"
"We're just researchers," says Wigdor, explaining that his team explores new ideas, which more commercially minded groups can then take on if they are interested. "But from a research point of view, we are looking at improving the user interaction."
A more pressing concern is how to slim down the LucidTouch design. The rear touch interface is currently provided using an unwieldy "boom camera" strapped to the back of the device that records finger movements.
Using a souped-up version of a touch panel like the iPhone's screen is one option. But there are other alternatives. "We could use LEDs to record the movement, because they are both emitters and sensors," says Wigdor.
"You would have the back of the device covered with them, half turned on and half turned off. Then the light from the LEDs that are on would be reflected from the hands and back onto the LEDs that are off." That would generate a charge that could show where the hands are, Wigdor says.
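The scheme Wigdor describes can be sketched as a simple simulation: a checkerboard of LEDs, half driven as emitters and half left off as sensors, where fingers near the panel reflect emitted light back onto the sensing LEDs. The grid size, the inverse-square reflection model, and the threshold below are illustrative assumptions, not details of any prototype.

```python
# Toy simulation of an LED panel that both emits and senses light.
# LEDs alternate checkerboard-style: (x + y) even = emitter (on),
# odd = sensor (off). Reflected light on each sensor falls off with
# squared distance to the nearest finger -- a crude optical model.

GRID = 16  # 16x16 LED panel (assumed size)

def sensor_readings(fingers, grid=GRID):
    """Return a grid of reflected-light readings on the sensing LEDs."""
    readings = [[0.0] * grid for _ in range(grid)]
    for y in range(grid):
        for x in range(grid):
            if (x + y) % 2 == 0:
                continue  # this LED is emitting, not sensing
            for fx, fy in fingers:
                d2 = (x - fx) ** 2 + (y - fy) ** 2
                readings[y][x] += 1.0 / (1.0 + d2)  # reflection strength
    return readings

def estimate_touch(readings, threshold=0.3):
    """Locate a finger as the weighted centroid of bright sensor cells."""
    pts = [(x, y, r)
           for y, row in enumerate(readings)
           for x, r in enumerate(row) if r > threshold]
    total = sum(r for _, _, r in pts)
    cx = sum(x * r for x, _, r in pts) / total
    cy = sum(y * r for _, y, r in pts) / total
    return cx, cy

if __name__ == "__main__":
    cx, cy = estimate_touch(sensor_readings([(5.0, 9.0)]))
    print(round(cx, 1), round(cy, 1))  # estimate near the finger at (5, 9)
```

In hardware the "readings" would be the small photocurrents Wigdor mentions, but the processing step is the same: threshold the sensing LEDs and take a centroid to find each hand or fingertip.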