Saturday, December 07, 2013

Soft Robotics in Space

Space robotics is understandably conservative. When the cost of putting a robot on a planet, moon or asteroid runs into billions, we need to be sure the technology will work. And with very long project lifetimes - spanning decades from engineering design to on-planet exploration - new advances in robotics face a long hard road from the research lab to real off-world use.

This context was very much on my mind when I gave a talk on Advanced Robotics for Space at the Appleton Space Conference last week. I used this great opportunity to outline a few examples of new research directions in robotics for the European space community, and to suggest how these could benefit future planetary robots. I had just 20 minutes, so I couldn't do much more than show a few video clips. The four new directions I highlighted are:
  1. Soft Robotics: soft actuation and soft sensing
  2. Robots with Internal Models, for self-repair
  3. Self-assembling swarm robots, for adaptive/evolvable morphology
  4. Autonomous 3D collective robot construction
In this post I want to talk about just the first of these: soft robotics, and why I think we should seriously consider soft robotics in space. Soft robotics - as the name implies - is concerned with making robots soft and compliant. It's a new discipline which already has its own journal, but not yet a Wikipedia page. Soft robots would be soft on the inside as well as the outside - so even the fur-covered Paro robot is not a soft robot. Soft robotics research is about developing new soft, smart materials for both actuation and sensing (ideally within the same material). Soft robots would have one huge advantage over conventional stiff metal-and-plastic robots: they would be light and, well, soft. For robots designed to interact with humans that's obviously a big benefit, because it makes the robot intrinsically much safer.

Soft robotics research is still at the exploratory stage, so there are not yet preferred materials and approaches. In our lab we are exploring several avenues: one is electroactive polymers (EAPs) for artificial muscles; another is the bio-mimetic 3D-printed flexible artificial whisker. A further approach makes use of shape memory alloys to actuate octopus-like limbs: here is a very nice YouTube movie from the EU OCTOPUS project. And perhaps the most unlikely but very promising approach of all is exploiting fluid-solid phase changes in ground coffee to make a soft gripper: the Jaeger-Lipson coffee balloon gripper.

Let me elaborate a little more on the coffee balloon gripper. It is based on the simple observation that when you buy vacuum-packed ground coffee the pack is completely solid, yet as soon as you cut it open and release the vacuum the ground coffee returns to its flowing, fluid state. Heinrich Jaeger, Hod Lipson and co-workers put ground coffee into a latex balloon and then, by controlling the vacuum with a pump, demonstrated a gripper able to safely pick up and hold more or less any object. Here is a YouTube video showing this remarkable ability.
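To make the mechanism concrete, here is a minimal sketch of the grasp cycle such a jamming gripper goes through: press the soft membrane onto the object, pull a vacuum so the grains jam and the membrane goes rigid, then lift. This is not the Jaeger-Lipson control code; the arm, pump and pressure-sensor interfaces below are hypothetical placeholders, and the pressure values are illustrative.

# Illustrative grasp cycle for a granular-jamming (coffee balloon) gripper.
# The arm, pump and pressure-sensor objects are hypothetical placeholders,
# not the Jaeger-Lipson implementation; pressures are gauge values in kPa.
import time

def jamming_grasp(arm, pump, pressure_sensor, target_pose, lift_pose,
                  jam_pressure_kpa=-60.0, settle_s=0.5):
    """Pick up an object of unknown shape with a granular-jamming gripper."""
    pump.vent()                       # membrane soft: the grains flow freely
    arm.move_to(target_pose)          # press the balloon down onto the object
    time.sleep(settle_s)              # let the grains flow around its shape
    pump.evacuate(jam_pressure_kpa)   # pull a vacuum: grains jam, balloon goes rigid
    while pressure_sensor.read_kpa() > jam_pressure_kpa + 5.0:
        time.sleep(0.05)              # wait until the membrane is fully jammed
    arm.move_to(lift_pose)            # lift: the jammed grains hold the object

def jamming_release(arm, pump, drop_pose):
    """Put the object down by letting air back into the membrane."""
    arm.move_to(drop_pose)
    pump.vent()                       # grains unjam and flow; the object is released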

Almost any planetary exploration robot is likely to need a gripper to pick up rock samples, whether for on-board analysis or for collection and return to Earth. Conventional robot grippers are complex mechanical devices that need very precise control in order to reliably pick up irregularly shaped and sized objects. That control is mechanically and computationally expensive, and problematic because of time delays if it has to be performed remotely from Earth. Something like the Jaeger-Lipson coffee balloon gripper would - I think - provide a much better solution. This soft gripper avoids the hard control and computation problem because the soft material adapts itself to the thing it is gripping; it's a great example of what we call morphological computation.
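To put a number on those time delays: even at the speed of light, a command-and-confirm cycle with Mars takes minutes, which makes closed-loop grasp control from Earth impractical. A quick back-of-the-envelope calculation (the distances are approximate orbital extremes, not figures for any particular mission):

# One-way and round-trip signal delays to Mars at its orbital extremes.
SPEED_OF_LIGHT_KM_S = 299792.458
MARS_DISTANCE_KM = {"closest approach": 54.6e6, "farthest": 401e6}

for label, distance_km in MARS_DISTANCE_KM.items():
    one_way_min = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"{label}: one-way {one_way_min:.1f} min, round trip {2 * one_way_min:.1f} min")

# closest approach: one-way 3.0 min, round trip 6.1 min
# farthest: one-way 22.3 min, round trip 44.6 min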

The second example I suggested is inspired by work in our lab on bio-inspired touch sensing. Colleagues have developed a device called TACTIP - a soft, flexible touch sensor which gives robots (or robot fingers) very sensitive touch, capable of detecting both shape and texture. Importantly, the sensing is done inside TACTIP, so the outer surface of the sensor can sustain damage without any loss of sensing. Here is a very nice YouTube report on the TACTIP project.
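To give a flavour of what "sensing done inside" can mean in practice, here is a minimal sketch of turning internal tactile data into a contact estimate. It assumes the sensor reports the 2D displacements of an array of markers tracked inside the soft skin; the marker layout, units and threshold are illustrative assumptions, and this is not the TACTIP processing pipeline.

# Minimal sketch: estimating where, and how hard, a soft tactile sensor is
# being pressed from the displacements of markers tracked inside the skin.
# The marker layout and threshold are illustrative, not TACTIP's design.
import math

def contact_estimate(marker_positions, rest_positions, threshold_mm=0.2):
    """Return (contact_centre, mean_displacement_mm), or None if nothing is touching.

    marker_positions and rest_positions are lists of (x, y) points in mm,
    giving each internal marker's current and unloaded position.
    """
    displaced = []
    for (x, y), (x0, y0) in zip(marker_positions, rest_positions):
        d = math.hypot(x - x0, y - y0)
        if d > threshold_mm:                 # this marker has been pushed
            displaced.append(((x0, y0), d))
    if not displaced:
        return None                          # no contact anywhere on the skin
    # Contact centre: displacement-weighted average over the moved markers.
    total = sum(d for _, d in displaced)
    cx = sum(x0 * d for (x0, _), d in displaced) / total
    cy = sum(y0 * d for (_, y0), d in displaced) / total
    return (cx, cy), total / len(displaced)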

It's easy to see that giving planetary robots touch sensing could be useful, but there's another possibility I outlined: the potential to allow Earth scientists to feel what the robot's sensor is feeling. PhD student Callum Roke and his co-workers developed a system based on TACTIP for what we call remote tele-haptics. Here is a video clip demonstrating the idea:



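For those curious how such a system hangs together, here is a rough sketch of the data flow in a remote tele-haptics loop: tactile frames captured on the robot are streamed over a link and replayed on a haptic display for the operator. The sensor, link and display interfaces are hypothetical placeholders rather than the system Callum and colleagues built, and the sketch ignores the signal delays discussed above, which a real planetary system would have to handle.

# Rough sketch of a remote tele-haptics loop. The tactile-sensor, link and
# haptic-display objects are hypothetical placeholders; this sketch ignores
# the long signal delays a real planetary system would have to deal with.
import time

def robot_side_loop(tactile_sensor, link, rate_hz=50.0):
    """On the robot: stream tactile frames to the operator."""
    period = 1.0 / rate_hz
    while True:
        frame = tactile_sensor.read_frame()   # e.g. per-marker displacements
        link.send(frame)                      # ship the frame across the link
        time.sleep(period)

def operator_side_loop(link, haptic_display):
    """On Earth: replay received frames on a haptic display."""
    while True:
        frame = link.receive()                # blocks until a frame arrives
        haptic_display.render(frame)          # reproduce the pressure pattern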
Imagine being able to run your fingers across the surface of Mars, or directly feel the texture of a piece of asteroid rock without actually being there.