Nimble-fingered robot has 99% success rate grasping irregular shapes

Roboticists at UC Berkeley have developed a nimble-fingered robot that picks up and moves awkward, everyday objects with a 99% success rate. Unlike any robot before it, it can grab irregularly shaped items such as spray bottles, rubber duckies, shoes, and open boxes virtually as easily and nimbly as human hands can.

For decades, scientists have struggled to create a truly nimble-fingered robot that can grab and pick up awkward objects, applying just the right amount of pressure instinctively.

Ken Goldberg, a Berkeley professor, Jeff Mahler, a researcher at Berkeley’s Laboratory for Automation Science and Engineering (AUTOLAB), and colleagues created DexNet 2.0 – a robot with an incredibly high grasping success rate. DexNet stands for Dexterity Network.

Prof. Ken Goldberg (left) and Siemens Research Group Head Juan Aparicio analyze DexNet’s ability to perform grasping motions downloaded from the cloud. (Image: siemens.com/innovation)

Nimble-fingered robot could revolutionize manufacturing

The roboticists believe that the technology they have used in developing DexNet 2.0 could soon be applied in industry. These ultra-dexterous robot hands have the potential to revolutionize manufacturing as well as the supply chain.

DexNet 2.0 has become an amazingly nimble-fingered robot thanks to deep learning. The scientists built an enormous database of 3-dimensional shapes – a total of 6.7 million data points – that a neural network uses to learn grasps that will pick up and move irregularly shaped objects.



They then connected the neural network to a 3D sensor and a robotic arm. When any object is placed in front of DexNet 2.0, it rapidly examines the shape and chooses a grasp that will pick it up and move it successfully 99% of the time.
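The pipeline described above – sense a depth image, generate candidate grasps, score each with the learned network, and execute the best one – can be sketched in miniature. The code below is a toy illustration, not the actual Dex-Net system: the function names are invented, and the learned grasp-quality network is replaced by a simple heuristic (prefer locally flat surface patches) purely to make the ranking step concrete.

```python
import numpy as np

def sample_grasps(depth_image, n=100, seed=0):
    """Sample candidate grasp centers (pixel coordinates) uniformly."""
    rng = np.random.default_rng(seed)
    h, w = depth_image.shape
    return rng.integers(0, [h, w], size=(n, 2))

def grasp_quality(depth_image, center, half=2):
    """Toy stand-in for the learned quality network: favor grasps
    centered on locally flat regions (low depth variance)."""
    r, c = center
    patch = depth_image[max(r - half, 0):r + half + 1,
                        max(c - half, 0):c + half + 1]
    return 1.0 / (1.0 + patch.var())

def plan_grasp(depth_image, n_candidates=100):
    """Score every sampled candidate and return the highest-ranked one,
    mirroring the sense -> propose -> rank -> execute loop."""
    candidates = sample_grasps(depth_image, n_candidates)
    scores = np.array([grasp_quality(depth_image, c) for c in candidates])
    best = int(np.argmax(scores))
    return tuple(candidates[best]), float(scores[best])

# Synthetic depth image: sensor noise everywhere, one flat plateau.
depth = np.random.default_rng(1).normal(0.5, 0.05, (64, 64))
depth[20:30, 20:30] = 0.4  # a flat, easily graspable region
center, score = plan_grasp(depth)
print(f"best grasp at {center}, quality {score:.3f}")
```

In the real system, the heuristic scorer is a convolutional network trained on the 6.7-million-point dataset, and the selected grasp is handed to the robotic arm for execution.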

According to Goldberg and Mahler, DexNet 2.0 is three times faster than its previous version.

Picking up or grasping objects of different weights, sizes and shapes is a complicated undertaking for a robot. In the cloud, computations are parallelized and executed efficiently in order to discover the best grasping motions. (Image: siemens.com/content)

In the latest issue of MIT Technology Review, DexNet 2.0 was featured as a cover story. According to the Review, DexNet 2.0 is “the most nimble-fingered robot yet.”

Will Knight, writing in the MIT Technology Review, says this latest development shows how new approaches to robot learning, together with the ability for robots to access data from the cloud, could advance their capabilities in warehouses and factories, and might even enable them to do useful work in homes and hospitals.



Their work will be published in July at a major robotics conference. The article is currently available on arXiv – a repository of electronic preprints, known as e-prints, of scientific papers – under the title Dex-Net 2.0: Deep Learning to Plan Robust Grasps with Synthetic Point Clouds and Analytic Grasp Metrics.

The UC Berkeley researchers collaborated with Juan L. Aparicio, who heads a Siemens research team in advanced manufacturing automation, located in Berkeley. Siemens is interested in commercializing cloud robotics, as well as other connected manufacturing technologies.

Aparicio says this nimble-fingered robot is exciting because the reliability of its arm offers a clear path toward commercialization.

In an article on the Siemens website, titled A Cloud for Robotics, Aparicio said:

“Today, robots just have to pick and place the same object over and over. But in the near future, as we move toward lot-size-one, it will be unworkable to reprogram all of a robot’s movements for each new object.”

“And in order to figure out how to grasp new objects on their own, robots will need the cloud.”

Video – DexNet 2.0: A nimble-fingered robot

This CITRIS video talks about DexNet 2.0, the most dexterous robot ever created.
