MIT researchers enable soft robotic arm to understand its configuration in 3D space using "sensorized" skin


MIT has announced that, for the first time, its researchers have enabled a soft robotic arm to understand its configuration in 3D space using only motion and position data from its “sensorized” skin.

MIT notes that soft robots made from highly compliant materials, similar to those found in living organisms, are being championed as safer, more adaptable, more resilient, and more bioinspired alternatives to traditional rigid robots. Giving these deformable robots autonomous control is considered a “monumental task,” though, because at any given moment they can move in a virtually infinite number of directions, which makes it hard to train the planning and control models that drive automation.

Autonomous control is traditionally achieved with large systems of multiple motion-capture cameras, MIT says. These cameras provide the robots with feedback about 3D movement and position, but such large systems are impractical for soft robots in real-world applications.

In a paper being published in the journal IEEE Robotics and Automation Letters, MIT researchers describe a system of soft sensors that cover a robot’s body to provide “proprioception,” that is, awareness of the motion and position of its body. That feedback runs into a novel deep-learning model that sifts through the noise and captures clear signals to estimate the robot’s 3D configuration. The researchers validated the system on a soft robotic arm resembling an elephant trunk, which can predict its own position as it autonomously swings around and extends.

The sensors can be fabricated from off-the-shelf materials, so any lab can develop its own system, according to Ryan Truby, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-first author on the paper along with CSAIL postdoc Cosimo Della Santina.

“We’re sensorizing soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication,” Truby explains. “We want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. This is a first step toward that type of more sophisticated automated control.”

A future goal is to help make artificial limbs that can more dexterously handle and manipulate objects in the environment.

“Think of your own body: You can close your eyes and reconstruct the world based on feedback from your skin. We want to design those same capabilities for soft robots,” explains co-author Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science.

Fully integrated body sensors have been a long-term goal in soft robotics, MIT notes, as traditional rigid sensors detract from a soft robot body’s natural compliance, complicate its design and fabrication, and can cause various mechanical failures. Soft-material-based sensors are considered a more suitable alternative, but they require specialized materials and methods for their design, making them difficult for many robotics labs to fabricate and integrate into soft robots.

Truby explains that one day, while working in his CSAIL lab and looking for inspiration for sensor materials, he made an interesting connection.

“I found these sheets of conductive materials used for electromagnetic interference shielding, that you can buy anywhere in rolls,” Truby says.

These materials have “piezoresistive” properties, meaning their electrical resistance changes when they are strained. Truby realized that, placed at certain spots on the trunk, they could make effective soft sensors.

MIT explains that as a sensor deforms in response to the trunk’s stretching and compressing, its change in electrical resistance is converted to a specific output voltage, which is then used as a signal correlating to that movement.
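As a concrete illustration, the snippet below sketches one way such a readout channel could work, assuming a simple voltage-divider circuit; the article does not specify the actual electronics, and the supply voltage and reference resistance here are illustrative values, not figures from the paper.

```python
# Minimal sketch of one sensor readout channel, assuming a simple
# voltage-divider circuit. V_IN and R_REF are illustrative values,
# not from the paper.

V_IN = 5.0      # assumed supply voltage (V)
R_REF = 1000.0  # assumed fixed reference resistor (ohms)

def sensor_voltage(r_sensor: float) -> float:
    """Output voltage for a piezoresistive sensor of resistance r_sensor.

    As the sensor stretches, its resistance changes, which shifts
    the divider's output voltage, the signal correlated with the
    trunk's movement.
    """
    return V_IN * r_sensor / (r_sensor + R_REF)

# Example: a stretch that raises resistance from 1.0 kOhm to 1.5 kOhm
print(sensor_voltage(1000.0))  # 2.5 V at rest
print(sensor_voltage(1500.0))  # 3.0 V when strained
```

In practice, each of the trunk's sensor channels would feed a signal like this into the learning model described later in the article.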

The material didn’t stretch much, though, which would limit its use in soft robotics. Truby drew inspiration from kirigami, a variation of origami that involves making cuts in a material, and designed and laser-cut rectangular strips of the conductive silicone sheets into various patterns, such as rows of tiny holes or crisscrossing slices like a chain-link fence. Truby says this made them far more flexible, stretchable, “and beautiful to look at.”

The researchers’ robotic trunk is made up of three segments, each with four fluidic actuators (12 total) used to move the arm. They fused one sensor over each segment, with each sensor covering and gathering data from one embedded actuator in the soft robot. To attach the sensors, they used “plasma bonding,” a technique that energizes the surface of a material to make it bond to another material. With a handheld plasma-bonding device, it takes a few hours to shape dozens of sensors that can be bonded onto the soft robot.

The sensors captured the trunk’s general movement, as hypothesized, but they were “really noisy,” MIT notes.

“Essentially, they’re nonideal sensors in many ways,” Truby says. “But that’s just a common fact of making sensors from soft conductive materials. Higher-performing and more reliable sensors require specialized tools that most robotics labs do not have.”

To sift through the noise and capture meaningful feedback signals, the researchers built a deep neural network that does most of the heavy lifting, estimating the soft robot’s configuration from the sensors alone. They also developed a new model that kinematically describes the soft robot’s shape while significantly lowering the number of variables their deep-learning model needs to process.
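As a rough, hypothetical sketch of this kind of regression network, the PyTorch model below maps the trunk's 12 noisy sensor channels to a reduced set of configuration variables. The layer sizes, activations, and output dimensionality are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

# Hypothetical estimator: maps noisy sensor voltages to a
# low-dimensional configuration vector. All sizes below are
# assumptions for illustration; the paper's architecture and
# reduced kinematic parameterization may differ.

N_SENSORS = 12       # assumed: one channel per actuator (3 segments x 4 actuators)
N_CONFIG_VARS = 12   # assumed size of the reduced kinematic description

class ConfigurationEstimator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_SENSORS, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, N_CONFIG_VARS),  # predicted configuration
        )

    def forward(self, sensor_signals: torch.Tensor) -> torch.Tensor:
        return self.net(sensor_signals)
```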

During experiments, the researchers had the trunk swing around and extend itself in random configurations for approximately 90 minutes while a traditional motion-capture system recorded ground-truth data. During training, the model analyzed data from the sensors to predict a configuration and compared its predictions to the ground-truth data being collected at the same time. In this way, the model “learns” to map signal patterns from its sensors to real-world configurations. Results showed that for certain, steadier configurations, the robot’s estimated shape matched the ground truth.
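Under the same assumptions as the sketch above, the training step could look roughly like the following supervised-regression loop, with random placeholder tensors standing in for the logged sensor readings and motion-capture ground truth; the hyperparameters are illustrative.

```python
import torch

# Placeholder stand-ins for data logged during the ~90-minute session
# (shapes and values are illustrative, not real data):
sensor_log = torch.randn(5000, 12)  # noisy sensor voltages
mocap_log = torch.randn(5000, 12)   # motion-capture ground-truth configurations

# A compact stand-in for the estimator sketched above.
model = torch.nn.Sequential(
    torch.nn.Linear(12, 64), torch.nn.ReLU(), torch.nn.Linear(64, 12)
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for epoch in range(100):
    pred = model(sensor_log)         # predict configuration from sensors
    loss = loss_fn(pred, mocap_log)  # compare against ground truth
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The regression objective is what lets the network learn to associate noisy signal patterns with real-world configurations, rather than requiring an explicit physical model of the deformable body.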

The next goal for the researchers is to explore new sensor designs for improved sensitivity and to develop new models and deep-learning methods that reduce the training required for each new soft robot. They also hope to refine the system to better capture the robot’s full dynamic motions.

The neural network and sensor skin are not currently sensitive enough to capture subtle or highly dynamic movements. Truby notes, though, that this is still considered an important first step for learning-based approaches to soft robotic control.

“Like our soft robots, living systems don’t have to be totally precise,” Truby says. “Humans are not precise machines, compared to our rigid robotic counterparts, and we do just fine.”