When the iPhone debuted with its multitouch screen, people were amazed. Instead of poking with a stylus at the old-fashioned screens on Palm Pilots, users could use their fingers to drag, flick and pinch to zoom.
But touch screens can do even more — by listening — said Chris Harrison, co-founder of a startup called Qeexo. The company has just debuted new technology called FingerSense that adds a tiny acoustic sensor inside a smartphone to listen for and measure the vibrations of the object that touches it. (The technology can be used in tablets, too.)
The modified screen can easily tell the difference between, for example, a fingertip and a knuckle tap, just as any of us can distinguish drumming our fingers from knocking our knuckles on a tabletop. FingerSense can also register the tap of a fingernail, which has no effect on an ordinary touch screen: the capacitive screens found on smartphones and tablets rely on the electrical disruption caused when skin touches them.
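Harrison didn't disclose how the classification works, but the underlying idea, that different objects produce distinct vibration signatures, can be illustrated with a toy nearest-profile classifier. Every value and feature below is invented for illustration; this is not Qeexo's method:

```python
import math

# Toy illustration (invented values, not Qeexo's actual technique):
# each touch object tends toward a characteristic vibration profile,
# here reduced to two made-up features: peak amplitude and dominant
# frequency in kHz. Classification picks the nearest known profile.
PROFILES = {
    "fingertip":  (0.2, 1.0),   # soft tissue: low amplitude, low frequency
    "knuckle":    (0.6, 3.0),   # bone: sharper, higher-frequency impact
    "fingernail": (0.9, 6.0),   # hard, crisp click
}

def classify(amplitude, frequency_khz):
    """Return the profile name closest to the measured features."""
    def distance(profile):
        a, f = profile
        return math.hypot(amplitude - a, frequency_khz - f)
    return min(PROFILES, key=lambda name: distance(PROFILES[name]))
```

A soft, low-frequency tap such as `classify(0.25, 1.2)` lands nearest the fingertip profile, while a hard, high-frequency one such as `classify(0.85, 5.5)` lands nearest the fingernail profile. A real system would extract far richer features from the acoustic sensor's signal, but the matching principle is the same.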
In a demonstration at the New York Tech Meetup this week, Harrison showed how the technology could trigger additional functions. When he touched the screen of a modified Samsung Galaxy S III smartphone with a fingertip, the phone behaved like any other capacitive touch screen, letting him scroll through and select photos to view. But when he tapped the screen with a knuckle, a popup menu appeared (analogous to right-clicking a mouse) with options such as emailing the photo or sending it to Facebook.
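The interaction model in the demo resembles mouse-button dispatch: the screen reports a touch type along with coordinates, and the app routes each type to its own handler. A minimal Python sketch, with every name hypothetical rather than taken from Qeexo's API:

```python
# Hypothetical sketch of touch-type dispatch (all names are illustrative,
# not Qeexo's actual API). Each classified touch type maps to a handler,
# the way a mouse maps left- and right-clicks to different actions.

def open_photo(x, y):
    return f"open photo at ({x}, {y})"

def show_context_menu(x, y):
    # A knuckle tap acts like a right-click: surface share options.
    return f"context menu at ({x}, {y}): email, send to Facebook"

def draw_stroke(x, y):
    return f"draw stroke at ({x}, {y})"

HANDLERS = {
    "fingertip": open_photo,
    "knuckle": show_context_menu,
    "stylus": draw_stroke,
}

def on_touch(touch_type, x, y):
    # Unrecognized types fall back to ordinary fingertip behavior.
    handler = HANDLERS.get(touch_type, open_photo)
    return handler(x, y)
```

The appeal of this design is that existing fingertip gestures keep working unchanged, while knuckle and stylus touches open up a second and third layer of input on the same screen.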
Because it doesn't rely on the electrical properties of the user, FingerSense can also register other objects. For example, Harrison used a stylus to draw on the screen of the smartphone, even though a standard Galaxy S III isn't designed to work with a stylus. If touching the screen makes any noise or vibration, the screen can read it.
Trying it myself, I found the screen to be extremely sensitive. Even when I moved my finger ever-so-slowly toward the screen, scarcely touching it — much more gingerly than someone would in real life — the technology could generally tell the difference between a fingertip, a knuckle, a fingernail and a stylus. When I touched the screen as I normally would with any other smartphone, it interpreted the gesture correctly every time.
Harrison is now pitching the technology to device makers, so he wouldn't provide exact details of how it works. But he did tell me that it uses a standard, off-the-shelf acoustic sensor small enough to fit inside the jam-packed innards of a smartphone, though he declined to say whether it is attached to the screen or placed elsewhere. He gave no indication of whether any company has committed to using the technology, or when it might appear in products.
The choice of a Galaxy S III as his test model isn't necessarily an indicator of which companies Harrison is working with. But a Samsung partnership wouldn't be far-fetched. Samsung has equipped its Galaxy Note smartphones and tablets with an included stylus, the S Pen (which uses electromagnetic rather than acoustic technology), allowing sketching as well as handwriting on a touch screen. And that seems to be a technology that Apple, embroiled in patent lawsuits with Samsung, can't claim as its invention.