Today, Rethink Robotics, a Boston-based start-up run by famous roboticist Rodney Brooks, has unveiled a manufacturing robot that can safely interact with humans, is easily programmable, and at $22,000 is pretty inexpensive, as industrial robots go. Brooks thinks that the robot, Baxter, which goes on sale in October, could revolutionize manufacturing by creating a new source of inexpensive factory labor.
Are your fingers resting on a slick touchscreen or a wooden desk? The sense of touch and the ability to differentiate between textures provide invaluable information about the world around us—and now they may be able to transmit that information to robots and prosthetic hands as well.
Researchers have developed a mechanical “finger” called the BioTac, made up of a rigid central sensor surrounded by liquid and covered in a flexible skin. When the BioTac strokes a surface, that surface’s texture produces unique vibrations in the skin, which has ridges like those seen in a human fingerprint. The BioTac’s software can interpret those vibrations, along with the force that the surface exerts on the mechanical finger, to identify 117 different textures with a 95 percent success rate. In fact, when it came to distinguishing between textures, the BioTac actually outperformed humans.
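The general idea—telling textures apart by the vibration "signature" each one produces—can be illustrated with a toy sketch. This is not the BioTac's actual software; it's a minimal, hypothetical classifier that summarizes each vibration recording by its frequency spectrum and assigns the texture whose average spectrum is closest:

```python
# Toy texture classifier: compare vibration recordings by their power spectra.
# (An illustrative sketch, NOT the BioTac's real algorithm.)
import numpy as np

def spectral_features(signal, n_bins=8):
    """Summarize a vibration signal as coarse, normalized bands of its power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spectrum, n_bins)
    feats = np.array([band.mean() for band in bands])
    return feats / feats.sum()  # normalize so overall stroke strength cancels out

def train(labeled_signals):
    """Average the features of each texture's training recordings into a centroid."""
    return {
        label: np.mean([spectral_features(s) for s in signals], axis=0)
        for label, signals in labeled_signals.items()
    }

def classify(signal, centroids):
    """Assign the texture whose centroid is nearest in feature space."""
    feats = spectral_features(signal)
    return min(centroids, key=lambda lbl: np.linalg.norm(feats - centroids[lbl]))
```

Different textures excite different dominant frequencies in the skin, so even this crude nearest-centroid scheme can separate them; the real system also folds in force measurements and distinguishes over a hundred classes.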
[via Pop Sci]
Video courtesy of University of Southern California
A robot has learned a handful of simple words in the same general way that infants do: by listening to the speech and feedback of human adults.
Human teachers—who ran the gamut in terms of age, occupation, and experience with kids—worked with a humanoid, toddler-sized robot, describing the colors and shapes on a toy block, as seen in the video above and described in a new study in PLoS ONE. The robot babbled back, learning which combinations of sounds were correct based both on what it had heard and on how the human responded, much like babies do when learning to speak. Giving the robot a childlike form, the researchers suggest, let people interact with it more like they would with an actual baby, helping it model language learning better than having people talk to a screen or a box.
It’s pretty cool that the robot could pick up words from human-like interactions. But it’s important to keep in mind that we can only build robots to imitate what it looks like when babies learn, because we don’t know exactly what’s going on in babies’ brains when they learn language—and we certainly don’t understand it well enough to build a program that works just the same way.
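That imitation can be boiled down to a feedback loop: babble candidate sound combinations, then reinforce the ones a teacher approves of. Here's a deliberately simple sketch of that loop—a hypothetical toy, not the study's actual model:

```python
# Toy babble-and-feedback loop (an illustrative sketch, NOT the study's model).
import random

class Babbler:
    """Learns which syllable combinations count as 'words' from teacher feedback."""

    def __init__(self, syllables, seed=0):
        self.syllables = syllables
        self.rng = random.Random(seed)
        self.scores = {}  # utterance -> accumulated reinforcement

    def babble(self):
        # Sometimes repeat the best-reinforced utterance; otherwise explore.
        if self.scores and self.rng.random() < 0.5:
            best = max(self.scores, key=self.scores.get)
            if self.scores[best] > 0:
                return best
        return "".join(self.rng.choices(self.syllables, k=2))

    def hear_feedback(self, utterance, approved):
        self.scores[utterance] = self.scores.get(utterance, 0) + (1 if approved else -1)

def teach(babbler, target, rounds=500):
    """A scripted 'teacher' that approves only the target word."""
    for _ in range(rounds):
        utterance = babbler.babble()
        babbler.hear_feedback(utterance, approved=(utterance == target))
    return max(babbler.scores, key=babbler.scores.get)
```

Real infants, of course, do far more than tally approvals—which is exactly why a loop like this imitates the surface of word learning without explaining it.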
[via Wired Science]