Soft robots could open fundamentally new capabilities and applications by freeing robots from the many restrictions that come with a traditional rigid body. However, a major challenge for autonomous soft robots is control. First, because a soft robotic limb is highly deformable and compliant, it can take an almost unlimited number of configurations at any given time, making its behaviour difficult to model. Second, soft robots can rarely determine their three-dimensional position or configuration during operation, owing to the difficulty of integrating appropriate sensors for motion or position feedback.
Rigid sensors or motion-capture imaging techniques are often used to help soft robots sense their configuration, but these approaches are impractical for real-world soft robotic applications and/or limit the “softness” of the system.
Research from 2018 Fellow Dr. Ryan Truby and colleagues at MIT CSAIL provides a step towards overcoming these obstacles. The MIT team used soft sensors to create a ‘sensorized skin’ around an elephant trunk-inspired soft robotic limb, giving the robot feedback on its motion and position, much as proprioceptors do in the human body. The sensor signals are fed through a deep-learning model that estimates the robot’s three-dimensional configuration.
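To illustrate the idea of learned proprioception, the pipeline can be sketched as a small feedforward network that maps raw skin-sensor signals to 3D keypoint estimates along the limb. This is a minimal illustrative sketch, not the team’s actual model: the sensor count, keypoint count, layer sizes, and function names below are all assumptions, and the weights are random stand-ins for a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 12    # assumed number of skin sensor channels
N_KEYPOINTS = 4   # assumed number of tracked points along the limb
HIDDEN = 32       # assumed hidden-layer width

# Randomly initialised weights stand in for a trained model.
W1 = rng.normal(scale=0.1, size=(N_SENSORS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, N_KEYPOINTS * 3))
b2 = np.zeros(N_KEYPOINTS * 3)

def estimate_configuration(sensor_signals: np.ndarray) -> np.ndarray:
    """Map one frame of sensor readings to N_KEYPOINTS (x, y, z) estimates."""
    h = np.tanh(sensor_signals @ W1 + b1)  # hidden layer
    out = h @ W2 + b2                      # linear output layer
    return out.reshape(N_KEYPOINTS, 3)

signals = rng.normal(size=N_SENSORS)       # one frame of skin readings
positions = estimate_configuration(signals)
print(positions.shape)  # one 3D position estimate per keypoint
```

In practice such a network would be trained on pairs of sensor readings and ground-truth positions (for example from a motion-capture rig), after which the cameras are no longer needed at run time.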
“Soft robots have the potential to revolutionize how and where we can deploy robots, but there are still some major materials, robotics, and operational challenges to overcome. Our new framework, which introduces a new sensorization strategy and deep learning approach to soft robot proprioception, represents an important step towards new soft robots that can orient and control themselves.” – Ryan Truby, 2018 Fellow.