Washington: MIT researchers have developed a soft 3D printed robotic hand that is so responsive it can safely pick up and handle objects that are incredibly delicate, such as an egg or a compact disc.
Robots have many strong suits, but delicacy traditionally hasn’t been one of them. Rigid limbs and digits make it difficult for them to grasp, hold, and manipulate a range of everyday objects without dropping or crushing them. Researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have discovered that the solution may be to turn to a substance more commonly associated with new buildings and Silly Putty: silicone.
Recently, researchers demonstrated a 3D printed robotic hand made out of silicone rubber that can lift and handle objects as delicate as an egg and as thin as a compact disc. Its three fingers have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items. “Robots are often limited in what they can do because of how hard it is to interact with objects of different sizes and materials,” CSAIL Director Daniela Rus said. “Grasping is an important step in being able to do useful tasks; with this work we set out to develop both the soft hands and the supporting control and planning systems that make dynamic grasping possible.”
Researchers said that soft robots have a number of advantages over “hard” robots, including the ability to handle irregularly-shaped objects, squeeze into tight spaces, and readily recover from collisions. “A robot with rigid hands will have much more trouble with tasks like picking up an object. This is because it has to have a good model of the object and spend a lot of time thinking about precisely how it will perform the grasp,” graduate student Bianca Homberg said.
Soft robots represent an intriguing new alternative. However, one downside to their extra flexibility is that they often have difficulty accurately measuring where an object is, or even whether they have successfully picked it up at all. That is where the CSAIL team’s “bend sensors” come in. When the gripper homes in on an object, the fingers send back location data based on their curvature.
Using this data, the robot can pick up an unknown object and compare it to the existing clusters of data points that represent past objects. With just three data points from a single grasp, the robot’s algorithms can distinguish between objects as similar in size as a cup and a lemonade bottle.
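To illustrate the idea, the identification step can be sketched as a nearest-cluster lookup: three curvature readings (one per finger) are compared against stored signatures of past grasps, and the closest match wins. This is a minimal, hypothetical sketch, not CSAIL’s actual code; the object names and curvature values are invented for illustration.

```python
import math

# Illustrative mean curvature signatures for previously grasped objects,
# one reading per finger. These numbers are invented, not measured data.
KNOWN_OBJECTS = {
    "cup":             (0.42, 0.40, 0.43),
    "lemonade bottle": (0.31, 0.30, 0.33),
    "egg":             (0.55, 0.57, 0.54),
}

def identify(reading):
    """Return the known object whose stored curvature cluster is nearest
    (by Euclidean distance) to a new three-finger reading."""
    def dist(centroid):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(reading, centroid)))
    return min(KNOWN_OBJECTS, key=lambda name: dist(KNOWN_OBJECTS[name]))

print(identify((0.43, 0.41, 0.42)))  # nearest to the "cup" cluster
```

Even with only three numbers per grasp, objects whose signatures sit in well-separated clusters can be told apart reliably, which matches the article’s cup-versus-bottle example.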