Robot finger has precise sense of touch

Tactile fingers progressing through their manufacturing stages: 3D-printed skeleton, flexible circuit board, transparent silicone layer, and reflective skin. (Pedro Piacenza, Columbia Engineering)

Touch sensors have been difficult to integrate into robot fingers for a number of reasons, but Columbia University researchers believe they have developed a robotic finger with a highly precise sense of touch over a complex, multicurved surface.

Current technology for designing robotic fingers with touch sensors has faced multiple obstacles, including difficulty covering multicurved surfaces, high wire counts, and trouble fitting into small fingertips, all of which prevent use in dexterous hands. To overcome these issues, the Columbia researchers took a different tack: using overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

"There has long been a gap between stand-alone tactile sensors and fully integrated tactile fingers—tactile sensing is still far from ubiquitous in robotic manipulation," said Matei Ciocarlie, associate professor in the departments of mechanical engineering and computer science. Ciocarlie led this work in collaboration with Electrical Engineering Professor Ioannis (John) Kymissis. Their research was published online in IEEE/ASME Transactions on Mechatronics, demonstrating the two aspects of the underlying technology that combine to enable the new results.


By measuring light transport between every emitter and receiver, the researchers demonstrated the ability to obtain a very rich signal data set that changes in response to the finger deforming due to touch. Then, they demonstrated that purely data-driven deep learning methods can extract useful information from the data, including contact location and applied normal force, without the need for analytical models.

Their final result is a fully integrated, sensorized robot finger, with a low wire count, built using accessible manufacturing methods and designed for easy integration into dexterous hands.

In this project, the researchers use light to sense touch. Under the "skin," their finger has a layer of transparent silicone into which they shine light from more than 30 LEDs. The finger also has more than 30 photodiodes that measure how the light bounces around. Whenever the finger touches something, its skin deforms, causing light to shift around in the transparent layer underneath.

Measuring how much light goes from every LED to every diode, the researchers ended up with close to 1,000 signals, each containing some information about the contact that was made. Since light can also bounce around in a curved space, these signals can cover a complex 3D shape such as a fingertip.
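The pairwise nature of the measurement is what makes the signal set so large. A minimal sketch of the arithmetic, with exact emitter and receiver counts assumed (the article says only "more than 30" of each):

```python
# Each emitter-receiver pair yields one light-transport measurement,
# so the signal count grows multiplicatively with both counts.
# The specific numbers below are illustrative assumptions.
num_emitters = 32   # "more than 30 LEDs"
num_receivers = 30  # "more than 30 photodiodes"

signals_per_frame = num_emitters * num_receivers
print(signals_per_frame)  # 960 measurements, i.e. "close to 1,000 signals"
```

This multiplicative scaling is why only a handful of emitters and receivers suffice to densely cover a curved fingertip.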

The team also designed the data to be processed by machine learning algorithms. The partially overlapping signals are too complex for humans to interpret directly, but current machine learning techniques can learn to extract the information that matters: where the finger is being touched, what is touching it, and how much force is being applied.

"Our results show that a deep neural network can extract this information with very high accuracy," said Kymissis. "Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms."

The team built the finger so that it, and others like it, can easily be integrated onto robotic hands. Although the finger collects nearly 1,000 signals, it needs only a 14-wire cable to connect to the hand and requires no complex off-board electronics. The researchers already have two dexterous hands (capable of grasping and manipulating objects) in their lab being outfitted with these fingers: one hand has three fingers, and the other has four. In the coming months, the team will use these hands to try to demonstrate dexterous manipulation abilities based on tactile and proprioceptive data.

"Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics and is one of the technologies that, in the longer term, is needed to enable personal robotic assistance in other areas, such as healthcare or service domains," Ciocarlie added.
