We humans often take for granted our ability to pick up and handle objects of different weights and sizes. We’ve probably all had the experience of picking up a suitcase we assumed was heavy, only to find it completely empty.
A key reason we can pick up and manipulate objects we’ve never seen before is our fingertips, which can quickly size up not just the weight of an object, but other nuanced characteristics such as pressure, friction and shape.
Image credit: MIT
In contrast to humans’ refined sense of touch, the robots we’ve spent decades developing don’t even have the tactile skills of toddlers. This matters because robots with so-called “haptic” (touch) sensors are increasingly used not just in factories, but in stores, offices and even people’s homes.
A team from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) is working on this problem, and has shown in a new paper that it can imbue a robot with a sophisticated enough sense of touch that it can grip a new object between two fingers, estimate the object’s properties by the way it feels, and swing it up to virtually any desired pose, all without looking.
The team’s “SwingBot” system can swing an object into a given pose with an error of only 17 degrees on average. (Without tactile sensing, its error roughly doubles.) The robot judges the object’s friction and heft by jiggling it between its fingers, and uses that information to plan the timing and trajectory of the swing.
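To get a feel for the idea of “estimate physical properties first, then plan the motion,” here is a minimal, purely illustrative Python sketch. It is not the authors’ method: the function names, the jiggle-based inertia estimate (averaging force-to-acceleration ratios), and the constant-torque swing model are all simplifying assumptions made for this example.

```python
import math

def estimate_inertia(jiggle_forces, jiggle_accels):
    """Hypothetical tactile-style estimate: if F = I * a for each small
    jiggle, the effective inertia is roughly the mean of F / a."""
    ratios = [f / a for f, a in zip(jiggle_forces, jiggle_accels)]
    return sum(ratios) / len(ratios)

def swing_torque_for_pose(inertia, target_angle_rad, swing_time_s):
    """Constant torque that rotates a body of the given inertia from rest
    through target_angle_rad in swing_time_s (theta = 0.5 * alpha * t^2)."""
    alpha = 2.0 * target_angle_rad / swing_time_s ** 2
    return inertia * alpha

# Example: two jiggle readings suggest an inertia near 2.0, which then
# sets the torque needed to reach a half-turn (pi radians) in one second.
inertia = estimate_inertia([2.0, 4.2], [1.0, 2.1])
torque = swing_torque_for_pose(inertia, math.pi, 1.0)
```

The point of the sketch is the two-stage structure: sensing informs a model parameter (here, inertia), and the planner uses that parameter to choose the swing. The real system estimates richer properties, such as friction, and plans full trajectories rather than a single constant torque.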
“Many methods for enabling robot manipulation involve a camera, but there are a lot of vital nuances that you miss out on if you’re picking up an object based only on vision,” says CSAIL graduate student Shaoxiong Wang, co-lead author on a new paper about SwingBot that he wrote with collaborators including MIT professor Edward Adelson. “With our approach, we can use touch to estimate an object’s physical properties.”
The researchers say that such a system would be useful for organizations seeking more cost-effective tactile robots. The CSAIL team used GelSight, a haptic sensor Adelson developed that offers high-definition resolution with low-cost components.
Written by Adam Conner-Simons, MIT CSAIL