
Before Robots Take Our Jobs, They Need to Get a Grip

Robotic grippers and graspers already exist to pick up specific objects in controlled settings, but a robot hand as versatile and dexterous as a human one is still out of reach.
Woman adjusting a robot hand. Peter Cade / Getty Images

When Carnegie Mellon professor Howie Choset watches his students take tests, he’s certainly impressed with their ability to solve complex engineering problems. But what really fascinates him is the way they can effortlessly twirl a pencil around their fingers.

That’s because the most mundane things humans can do with their hands — whether it’s writing something down, grabbing an egg out of its carton, or picking an awkwardly stacked book off the shelf — are still big challenges for robots built by engineers like Choset.

“Manipulation, in many ways, is one of the final frontiers in robotics,” he says.

Robots hold a lot of promise as our future factory workers, warehouse stockers, and personal assistants. But the android takeover likely won’t happen until robots have hands as dexterous as ours. While robotic grippers and graspers already exist to pick up specific objects in controlled settings, a robot hand that’s as versatile as a human one is still just out of reach.

The Human Touch

The anatomy of the human hand — and the subtleties of its skillful behaviors — are hard to replicate in a robot for several reasons.

Because of our five fingers and many joints, “we have more than 20 degrees of freedom in our hands,” says Amir Shapiro, a roboticist at Ben-Gurion University of the Negev in Israel. Creating a robotic hand with more than 20 motors to mimic all of those biological parts would weigh down the end of the arm, and create a challenge for the roboticist who has to program the action of each motor for any different kind of movement, Shapiro says.


Undeterred, some engineers have taken up the “Westworld”-like task of building anthropomorphic robot hands. Last year, two researchers at the University of Washington revealed their amazingly lifelike design that was built from 3D-printed bones based on real skeletal models. It had rubber joints and tendons, and ten motors at the wrist with cable routing that the engineers said actually mimics the human carpal tunnel quite closely. Seeing the hand in motion as it manipulates a variety of objects with humanlike finesse is rather uncanny.

Still, the more common approach among engineers is to simplify the robot hand, often giving it just two or three fingers, like a sophisticated version of an arcade claw machine. RE2 Robotics’ Highly Dexterous Manipulation System shows that even a robot with two metal jaws for hands can make simple balloon animals and unwrap a gift box (with a human at the controls).

Other engineers are focusing their efforts on crafting soft robots that can more easily conform to the objects they’re trying to pick up.

“If you have less motors, you have less possible motions, but you can add softness to the hands to compensate,” says Maria Pozzi, a researcher at the University of Siena in Italy. Pozzi is working with mechanics researcher Monica Malvezzi and other colleagues on making a "synergistic" soft robotic hand as part of the European Union-funded project Soft Manipulation, or SoMa.

When you play piano, your fingers are moving independently of each other to hit the right keys. But when you grab something, your fingers are working together (or synergistically), closing in on the object “like a jellyfish,” says Domenico Prattichizzo, another SoMa researcher with the University of Siena. So instead of trying to replicate the micro motions of each joint, they’re trying to recreate that jellyfish-like macro motion we use for grabbing.
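To get a feel for the synergy idea, here is a minimal sketch of that kind of controller. The joint count, weights, and limits below are made up for illustration and are not taken from the SoMa project: a single "close" command drives every finger joint at once through a fixed weighting, rather than each joint being programmed separately.

```python
import numpy as np

NUM_JOINTS = 20  # roughly the number of degrees of freedom in a human hand

# Hypothetical synergy vector: how strongly each joint participates in a
# whole-hand closing motion (values chosen arbitrarily for illustration).
closing_synergy = np.linspace(0.6, 1.0, NUM_JOINTS)

def hand_posture(close_amount: float) -> np.ndarray:
    """Map one scalar command (0 = open, 1 = fully closed) to all joint angles."""
    max_flexion_rad = 1.5  # assumed mechanical limit per joint
    return np.clip(close_amount, 0.0, 1.0) * closing_synergy * max_flexion_rad

# One number controls the whole jellyfish-like closing motion:
print(hand_posture(0.5))  # all 20 joints flex together, partway closed
```

The point of the sketch is the dimensionality reduction: instead of planning 20 joint trajectories, the controller plans one, and the hand’s built-in coupling (or softness) takes care of the rest.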

Good Hands Need Good Brains

The physical construction of the hand isn’t the only concern. To grasp an object, humans rely on a combination of senses — mostly sight and touch — to figure out where to place their fingers and how much force to use. You probably don’t think about it, but the way you grab a strawberry is very different from how you pick up a cup of coffee. If we want robots that can autonomously and spontaneously manipulate a variety of objects without human controllers, engineers need to equip them with similar senses so they can perceive the exact shape, location, and weight of the things around them.

There are a few ways to do this. At Stefanie Tellex’s robotics lab at Brown University, engineers are training a humanoid bot named Baxter to scan unfamiliar objects with its cameras and infrared sensors and then attempt to pick them up with its two fingers. Baxter then shakes the objects around to find the right grasp. To get better at manipulating different objects, robots, like babies, might have to learn by trial and error.

But unlike babies, robots can easily share what they learn with other bots that have the same programming. Through its Million Object Challenge, Tellex’s lab is trying to enlist other Baxter robots to learn how to pick up a wide variety of objects and share that knowledge. The researchers hope to eventually compile a database of a million different objects, each paired with the best gripping techniques for it. Google Research has also tried to use machine learning and a vision system to get robots to teach themselves improved hand-eye coordination.
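As a rough illustration of what such a shared grasp database might look like — the structure and field names here are hypothetical, not the lab’s actual format — each object could be paired with candidate grasps and the success rates that robots have reported for them:

```python
from dataclasses import dataclass

@dataclass
class GraspRecord:
    approach: str           # e.g. "top-down pinch" or "side grasp"
    gripper_width_mm: float
    successes: int = 0
    attempts: int = 0

    @property
    def success_rate(self) -> float:
        return self.successes / self.attempts if self.attempts else 0.0

# Shared store: object identifier -> grasps other robots have already tried.
grasp_db: dict[str, list[GraspRecord]] = {}

def report_attempt(object_id: str, grasp: GraspRecord, succeeded: bool) -> None:
    """One robot logs the outcome of a trial-and-error grasp attempt."""
    grasp.attempts += 1
    grasp.successes += int(succeeded)
    records = grasp_db.setdefault(object_id, [])
    if grasp not in records:
        records.append(grasp)

def best_grasp(object_id: str) -> GraspRecord | None:
    """Another robot looks up the most reliable known grasp for this object."""
    candidates = grasp_db.get(object_id, [])
    return max(candidates, key=lambda g: g.success_rate, default=None)
```

A new robot facing a familiar object would call something like best_grasp first, and fall back to trial and error only for objects nobody in the fleet has handled yet.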

Other roboticists have advocated for mimicking the sense of touch to improve robotic hands. Vincent Duchaine, co-founder of Robotiq and a researcher at the École de Technologie Supérieure in Montreal, has written that the future of robotic grasping will require "tactile intelligence." His lab has been working on sensors for robot fingertips that can detect subtle vibrations that might signal how an object is slipping from its grasp. When MIT engineers created a soft gripper with three caterpillar-like fingers, they gave their bot “bend sensors” that helped determine what object was being picked up — like an egg, a CD, or a tennis ball — based on its curvature.
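A toy sketch of the vibration-based slip detection described above might watch the high-frequency content of the fingertip signal and squeeze a little harder when it spikes. The sample rate, cutoff, and thresholds below are made up for illustration, not taken from Duchaine’s sensors.

```python
import numpy as np

SAMPLE_RATE_HZ = 1000           # assumed tactile sampling rate
SLIP_ENERGY_THRESHOLD = 0.02    # assumed tuning constant

def slipping(fingertip_signal: np.ndarray) -> bool:
    """Flag slip when the high-frequency part of the tactile signal gets energetic.

    Assumes roughly a one-second window of samples at SAMPLE_RATE_HZ.
    """
    spectrum = np.abs(np.fft.rfft(fingertip_signal))
    freqs = np.fft.rfftfreq(len(fingertip_signal), d=1.0 / SAMPLE_RATE_HZ)
    high_freq_energy = spectrum[freqs > 100].mean()  # look above ~100 Hz
    return high_freq_energy > SLIP_ENERGY_THRESHOLD

def adjust_grip(force_newtons: float, fingertip_signal: np.ndarray) -> float:
    """Tighten the grip slightly whenever incipient slip is detected."""
    return force_newtons + 0.5 if slipping(fingertip_signal) else force_newtons
```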

Giving Big Business a Hand

If more robots can master the everyday task of grasping a variety of objects, the business implications would be huge. Companies with big warehouses to stock have been especially keen on solving the grasping problem.

In the near future, the British online supermarket Ocado hopes to have robots handle the 48,000 different items it sells.

Ocado partnered with the SoMa project and is already at work on a robot arm with springy blue fingers that can grab limes and apples. Ocado also got involved in another European Union project, SecondHands, with the aim of building humanoid robots that can provide assistance to factory floor workers by 2020.

Related: These Robots Look Freaky But Can Do Amazing Things

Ever eager to automate its operations, Amazon has also been searching for robots with the right stuff to shelve and bin objects in its gigantic warehouses.

The company has been trying to foster this innovation since 2015 with its Amazon Robotics Challenge, in which teams build bots that race to accurately pick and stow items like books, boxes, clothes and toiletries without damaging them, vying for a grand prize of $80,000.

It’s not exactly March Madness, but the stakes are still high. Last year’s winning team — a group from TU Delft Robotics Institute and the company Delft Robotics, both in the Netherlands — trained a bot to recognize 40 Amazon warehouse items. It managed to pick objects off the shelf at a speed of around 100 items per hour with a 16.7 percent failure rate, according to TechRepublic. For comparison, a human can move around 400 items an hour — so robots still have a ways to go.
