Robots like this? That’s nuts.
If mechanical engineer David Hu ruled the world, it would be crawling with robots based on mosquitoes, snakes, and Mexican jumping beans. Hu’s lab studies animal locomotion, but the research goes beyond the traditional slow-motion footage of running creatures. Instead, Hu examines topics like how water striders and rafts of ants stay afloat on water’s surface, the mechanics of giant pumpkins collapsing into amorphous blobs under their own weight, how snakes’ scales affect their slither, the optimal way for furry animals to shake off water, and how mosquitoes survive collisions with comparatively huge raindrops.

His group has even analyzed the motion of Mexican jumping beans, which is due not to some inherent magic in the “beans,” but rather to temperature-sensing moth larvae living in hollow seeds. (When the ground heats up, the larvae sense the change in temperature and make their seedy houses twitch into rolling movements toward cooler, shadier ground.)

These topics are weird and interesting enough to have garnered Hu’s work plenty of media coverage. But when it comes to earning funding, “weird and interesting” doesn’t always cut it. What’s the practical purpose of this research? Instead of shrugging and saying, “Now we know how mosquitoes struggle out from water droplets 50 times their size! That’s pretty cool!”, Hu has come up with a standard, one-size-fits-all application. At the end of his papers, he adds that whatever wacky phenomenon he studied could inspire…robots!
MABEL here is a fast lady. At 6.8 miles per hour, she’s the quickest human-like runner in the robot world. She is also the owner of some of the freakiest knees, right up there with Dr. Seuss’s ominous pale green pants and the spider-like prancings of BigDog, the defense robot you hope you never meet coming through the woods at night.
Running robots could transport baggage and participate in rescue operations where rugged terrain makes wheeled vehicles useless, which is why DARPA funds projects like the quadruped BigDog, which is already fairly well developed and has a top speed of about 5 mph. MABEL is a biped, which means she’s probably less stable than a quadruped but potentially better able to stand in for humans in activities like climbing stairs (and certainly a more useful instance of human biomimicry than some robots we could name). Watching her strut her stuff around a little indoor track in the video above, you’ll notice the springing motion of her legs, which is very similar to a human runner’s: both spend about 40% of their time in the air, according to her builders, a team of roboticists at the University of Michigan.
Ooo, ahh, and pity the lab downstairs.
[via Kurzweil AI]
Combining the Kinect’s body-scanning camera with overhead cameras, students at ETH Zurich’s Flying Machine Arena have created a nifty quadrocopter that’s controlled with simple gestures. Move your right arm around and the drone follows a similar path; raise your left arm and it flips; clap and it lands.
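The gesture mapping described above boils down to a small classify-then-dispatch loop. Here’s a minimal sketch of how that might look; the gesture names, thresholds, and command tuples are our own hypothetical stand-ins, not the ETH Zurich team’s actual code:

```python
# Hypothetical sketch: map a Kinect-style skeleton frame (joint positions
# in meters) to a drone command, per the gestures described above.

def classify_gesture(skeleton):
    """Label a skeleton frame: raised left arm, clap, or default tracking."""
    if skeleton["left_hand_y"] > skeleton["head_y"]:
        return "left_arm_raised"
    # Hands nearly at the same horizontal position -> treat as a clap.
    if abs(skeleton["left_hand_x"] - skeleton["right_hand_x"]) < 0.1:
        return "clap"
    return "follow_right_hand"

def command_for(gesture, skeleton):
    """Translate a gesture label into a drone command (names are made up)."""
    if gesture == "left_arm_raised":
        return ("flip",)
    if gesture == "clap":
        return ("land",)
    # Default: steer the drone toward the right hand's position.
    return ("goto", skeleton["right_hand_x"], skeleton["right_hand_y"])

# One example frame: left hand down, hands apart -> the drone should follow.
frame = {"head_y": 1.6, "left_hand_y": 1.0, "left_hand_x": -0.4,
         "right_hand_x": 0.5, "right_hand_y": 1.2}
print(command_for(classify_gesture(frame), frame))  # ('goto', 0.5, 1.2)
```

In a real system the classifier would run on every incoming skeleton frame, with the overhead cameras supplying the drone’s own position for closed-loop control.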
Hey, this is sort of like that sci-fi movie where people virtually controlled robots with just their body movements. It’s the weekend and we can’t think of the film—help us out in the comments section.
When DIY robotics and the single-minded persistence of scientists intersect, you get things like the Batcopter. This four-propellered hovering boxspring was designed by Boston University researchers to infiltrate flocks of millions of bats as they bob and weave, recording data about how they manage to avoid colliding while simultaneously (we are confident) scaring the bejeezus out of them.
CES finishes today, and hordes of attendees are making their way to the airport for their return journeys home. But next year, maybe we can all catch the show without the hassle of plane trips, getting lost in the casino while looking for the actual hotel part of the hotel/casino combo, and fighting it out over the last seat on the shuttle bus.
Anybots gives us an alternative with its robotic telepresence system. Its QA robot can be controlled from the comfort of any networked computer, allowing you to wander around and check out booths in spirit, if not in body. The QA has a 4-to-6-hour battery life (which, to be fair, is about my daily limit for walking up and down the show halls too). Its height means that its built-in camera sits at a normal human eyeline, and its body is jointed, allowing the QA to express some basic gestures and look down. While you watch the camera feed and control the QA remotely, a loudspeaker and microphone let you converse with any humans that might be present (the LCD screen on the QA’s chest can display a photo of its operator so that people know who they’re talking to). The base is similar to a Segway’s, allowing the robot to roll around while remaining upright, and has a built-in LIDAR system to automatically detect potential obstacles.
Of course, there is still one thing missing from the QA that gives going to the show in person the edge, at least for now: the current version has no arm with which to grab those free vendor giveaways…