DESIGN & PRODUCTS FLEXIBLE ELECTRONICS
Kirigami-laden soft robot
gets AI-driven proprioception
By Julien Happich
Soft robots constructed from highly compliant materials
are seen as potentially safer, more adaptable and more
resilient than today’s rigid robots. But accurate control
feedback loops prove difficult to implement for such deformable
bio-inspired robots due to their infinite degrees of freedom.
And the use of cumbersome multiple motion-capture cameras
to provide precise information about the robot’s 3D movement
and positions somehow defeats the purpose of designing a soft
robot.
Now, researchers from MIT have leveraged Artificial Intelligence
(AI) algorithms to analyse the output of flexible kirigami-shaped sensors integrated
into the skin of a soft robot trunk, enabling proprioception: the ability of the soft robot to “feel” how
it is twisted or bent and understand its own position in space.
Described in a paper published in the journal IEEE Robotics
and Automation Letters, the skin sensors consist of sheets
of conductive materials used for electromagnetic interference
shielding, which the researchers hollowed out or cut into precise
kirigami patterns that make the sheets much more flexible
and stretchable.
A soft robot connected to fluidic actuators sports a
“sensorized” skin made with kirigami-shaped sensors for its
proprioception. Ryan L. Truby, MIT CSAIL.
Because of their piezoresistive properties (varying their
electrical resistance when strained), these materials turn out to
make effective soft sensors as they deform in response to the
trunk’s stretching and compressing. Electrical resistance of the
sensors is converted to a specific output voltage, which is then
fed into a novel deep-learning model that sifts through the noise
and captures clear signals to estimate the robot’s 3D configuration,
correlated against ground-truth movement data captured with a motion-capture system.
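To picture this signal chain, here is a minimal sketch (not the researchers' code) that assumes each piezoresistive kirigami sensor is read through a simple voltage divider; the supply voltage, fixed resistor value and function names below are illustrative assumptions.

# Illustrative sketch (assumptions, not the paper's implementation): read a
# piezoresistive kirigami sensor through an assumed voltage-divider circuit
# and convert the measured voltage back to resistance and a strain proxy.

V_SUPPLY = 3.3        # assumed supply voltage (V)
R_FIXED = 10_000.0    # assumed fixed divider resistor (ohms)

def voltage_to_resistance(v_out: float) -> float:
    """Invert the divider: V_out = V_supply * R_sensor / (R_sensor + R_fixed)."""
    if v_out <= 0 or v_out >= V_SUPPLY:
        raise ValueError("measured voltage outside the divider's valid range")
    return R_FIXED * v_out / (V_SUPPLY - v_out)

def resistance_to_strain_proxy(r_sensor: float, r_unstrained: float) -> float:
    """Relative resistance change, a simple proxy for strain in the kirigami sheet."""
    return (r_sensor - r_unstrained) / r_unstrained

# Example: a sensor measuring 12 kOhm at rest reads 1.9 V when the trunk deforms
r0 = 12_000.0
r = voltage_to_resistance(1.9)
print(f"R = {r:.0f} ohm, dR/R0 = {resistance_to_strain_proxy(r, r0):.2%}")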
The researchers validated their system on a soft robotic arm
resembling an elephant trunk that can predict its own position
as it autonomously swings around and extends.
“We’re sensorizing soft robots to get feedback for control
from sensors, not vision systems, using a very easy, rapid
method for fabrication,” explains Ryan Truby, a postdoc in the
MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) who is
co-first author on the paper along with CSAIL postdoc Cosimo
Della Santina.
“We want to use these soft robotic trunks, for instance, to
orient and control themselves automatically, to pick things up
and interact with the world. This is a first step toward that type
of more sophisticated automated control.”
The soft sensors are conductive silicone sheets cut into
kirigami patterns with piezoresistive properties. Ryan L. Truby,
MIT CSAIL
One future aim is to help make artificial limbs that can more
dexterously handle and manipulate objects in the environment.
“Think of your own body: You can close your eyes and
reconstruct the world based on feedback from your skin,” says
co-author Daniela Rus, director of CSAIL and the Andrew and
Erna Viterbi Professor of Electrical Engineering and Computer
Science. “We want to design those same capabilities for soft
robots.”
The researchers’ robotic trunk comprises three segments,
each with four fluidic actuators used to move the arm. They
fused one sensor over each segment, with each sensor covering
and gathering data from one embedded actuator in the
soft robot. To estimate the soft robot’s configuration using only
the sensors, the researchers built a deep neural network to do
most of the heavy lifting. They also developed a new model to
kinematically describe the soft robot's shape, vastly reducing
the number of variables their model needs to process. In
training, the model analyzed data from its sensors to predict a
configuration, and compared its predictions to the ground truth
data collected simultaneously by the motion-capture system. In
doing so, the model “learned” to map signal patterns from its
sensors to real-world configurations, matching the robot’s true
position.
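This training setup can be pictured with a short sketch. The following outline is illustrative rather than the authors' code: the three sensor channels (one per segment), the small fully connected network, the choice of two shape parameters per segment for the reduced configuration, and the use of PyTorch are all assumptions made for the example.

# Illustrative sketch (not the paper's implementation): a small neural network
# maps kirigami-sensor voltages to a reduced kinematic configuration and is
# trained against configurations derived from the motion-capture system.
import torch
import torch.nn as nn

N_SENSORS = 3          # one kirigami sensor per trunk segment (per the article)
N_CONFIG = 6           # assumed: 2 shape parameters per segment x 3 segments

model = nn.Sequential(
    nn.Linear(N_SENSORS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, N_CONFIG),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(sensor_voltages: torch.Tensor, mocap_config: torch.Tensor) -> float:
    """One supervised step: predict the configuration from sensor voltages and
    regress it onto the ground truth from the motion-capture system."""
    optimizer.zero_grad()
    prediction = model(sensor_voltages)
    loss = loss_fn(prediction, mocap_config)
    loss.backward()
    optimizer.step()
    return loss.item()

# Stand-in random data; a real run would stream synchronised sensor and
# motion-capture samples.
voltages = torch.rand(32, N_SENSORS)   # batch of 32 sensor readings
configs = torch.rand(32, N_CONFIG)     # matching ground-truth configurations
print(train_step(voltages, configs))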