If you’re tired of squinting into your tiny iPod or phone screen, then how about switching to a whole new system that uses your skin’s surface as a screen? Enter “Skinput,” a new prototype that allows you to use your skin as both a touchscreen and an input device.
Researchers from Carnegie Mellon University and Microsoft’s Redmond lab found that jabbing at body parts like the forearm created acoustic waves that could be detected higher up the arm by a set of sensors strapped onto an armband. Because bone density and the amount of soft tissue vary across the body, each tapped location produces a distinct vibration signature. So, if you press a bunch of “buttons” being projected on your skin by a tiny projector on the armband, the device can track your inputs precisely.
Their software matches sound frequencies to specific skin locations, allowing the system to determine which “skin button” the user pressed. The prototype system then uses wireless technology like Bluetooth to transmit the commands to the device being controlled, such as a phone, iPod, or computer.
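The actual Skinput classifier isn’t detailed in this post, but the idea of matching a tap’s acoustic signature to a known skin location can be sketched with a toy nearest-profile lookup. Everything here is illustrative: the location names, the three frequency-band amplitudes, and the stored profiles are all made-up stand-ins for the real sensor data and machine-learning model.

```python
# Hypothetical "training" data: average amplitude in three frequency
# bands (low, mid, high) observed when tapping each skin location.
PROFILES = {
    "wrist":   (0.9, 0.4, 0.1),
    "forearm": (0.5, 0.8, 0.3),
    "palm":    (0.2, 0.5, 0.9),
}

def classify_tap(features):
    """Return the skin location whose stored profile is closest
    (squared Euclidean distance) to the observed band amplitudes."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(PROFILES, key=lambda loc: dist(PROFILES[loc], features))

# A tap dominated by low-frequency energy best matches the wrist profile.
print(classify_tap((0.85, 0.45, 0.15)))  # wrist
```

The real system replaces this hand-rolled lookup with a trained classifier over many acoustic features, but the principle is the same: each location’s vibration signature is distinct enough to identify which “skin button” was pressed.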
Twenty volunteers who tried out the prototype said it was easy to use, and the researchers are due to present their paper (pdf) in April at the Computer-Human Interaction conference in Atlanta. Here’s a video explaining how Skinput works:
Discoblog: New Device Can Turn Your Kitchen Table Into a Touchscreen
Discoblog: Why Our Oily Fingers Can Never Soil the iPhone’s Pristine Screen