Robots as Clever as an Octopus
Octopuses have nine brains: a donut-shaped central brain in the head, plus a smaller one in each of their eight arms. They can use tools, solve puzzles, and recognize human faces.
Take Otto, an octopus that once lived in the SeaStar Aquarium in Germany. He was known to squirt water onto an annoying light above his tank, short-circuiting the lights in the entire building. Employees said he got bored when the aquarium was closed and would rearrange the tank, juggle hermit crabs, and throw rocks at the glass.
Tool use is relatively rare in animals and is typically associated with apes, monkeys, dolphins, and birds. Among invertebrates, only octopuses and a few insects are known to use tools.
No wonder, then, that octopuses are the envy of the underwater soft robotics world.
This week, Marine Technology Reporter covered how MBARI researchers have deployed a new imaging system to study the movement of deep-sea octopuses.
The researchers can use their EyeRIS camera system to track the movements of specific points on an octopus’s arm, identifying areas of curvature and strain in real time as the animal crawls over rugged seafloor terrain.
The aim is to use the 3D visual data collected to help design bioinspired robots.
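EyeRIS and its processing pipeline are MBARI's own, but the underlying geometry is easy to sketch. Given tracked 3D marker positions along an arm, per-point curvature can be estimated with the classic three-point (Menger) formula, and per-segment strain from how much each segment stretches relative to a relaxed reference. The marker layout, rest lengths, and function names below are invented for illustration:

```python
import numpy as np

def menger_curvature(a, b, c):
    """Curvature (1/m) of the circle through three 3D points.

    kappa = 4 * triangle_area / (|ab| * |bc| * |ca|); collinear points
    give zero curvature (a straight arm segment).
    """
    ab, bc, ca = b - a, c - b, a - c
    area = 0.5 * np.linalg.norm(np.cross(ab, -ca))
    denom = np.linalg.norm(ab) * np.linalg.norm(bc) * np.linalg.norm(ca)
    return 0.0 if denom < 1e-12 else 4.0 * area / denom

def curvature_and_strain(points, rest_lengths):
    """Per-point curvature and per-segment strain for one video frame.

    `points`: (N, 3) tracked marker positions, ordered base to tip.
    `rest_lengths`: (N-1,) segment lengths in a relaxed reference pose.
    """
    pts = np.asarray(points, dtype=float)
    kappa = np.array([menger_curvature(pts[i - 1], pts[i], pts[i + 1])
                      for i in range(1, len(pts) - 1)])
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    strain = (seg - rest_lengths) / rest_lengths  # >0 stretched, <0 compressed
    return kappa, strain

# Sanity check: markers on a quarter circle of radius 0.5 m should report
# curvature close to 1/0.5 = 2.0 everywhere and near-zero strain.
theta = np.linspace(0.0, np.pi / 2, 20)
arc = np.column_stack([0.5 * np.cos(theta), 0.5 * np.sin(theta), np.zeros_like(theta)])
rest = np.full(19, np.linalg.norm(arc[1] - arc[0]))
kappa, strain = curvature_and_strain(arc, rest)
print(kappa.round(2), strain.round(3))
```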
Intelligent materials and control systems that mimic organic movement, such as that of an octopus, are critical to the success of soft robots, which offer greater agility, more task flexibility, and a gentler touch than rigid robots.
A multinational study recently published in Engineering Science and Technology highlights why soft robots are hard to control: a deformable body can move in far more ways than a rigid, jointed one, and planning and coordinating that motion demands complex problem solving and substantial computing power.
Recent research has focused on integrating intelligent soft materials: shape-memory polymers that can be 3D printed into complex structures, and hydrogels that can be molded into soft actuators with tunable stiffness and responsiveness to external stimuli.
Existing control strategies struggle with shifting hydrodynamic forces and local disturbances, which limits maneuverability and accuracy. In response, researchers are developing advanced control systems that use real-time feedback from onboard sensors to dynamically adjust a robot’s behavior. These include machine learning techniques such as neural networks and reinforcement learning.
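As a deliberately simplified sketch of what a learned controller looks like in code, the snippet below maps a sensor vector to actuator commands through a tiny neural network. The sensor layout is hypothetical, and the random weights stand in for ones that reinforcement-learning training against a simulator would actually produce:

```python
import numpy as np

class TinyPolicy:
    """Minimal learned-control sketch: sensor vector in, actuator commands out.

    The two-layer network stands in for the neural-network controllers the
    article mentions; real weights would come from reinforcement-learning
    training, not the random initialization used here.
    """
    def __init__(self, n_sensors=6, n_hidden=16, n_actuators=4, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.5, (n_sensors, n_hidden))
        self.w2 = rng.normal(0.0, 0.5, (n_hidden, n_actuators))

    def act(self, sensors):
        hidden = np.tanh(sensors @ self.w1)  # nonlinear features of robot state
        return np.tanh(hidden @ self.w2)     # actuator commands bounded to [-1, 1]

# One control tick with a hypothetical sensor vector, e.g. depth error,
# pitch, roll, and a 3-axis flow estimate.
policy = TinyPolicy()
print(policy.act(np.array([0.4, 0.05, -0.02, 0.1, 0.0, 0.3])))
```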
Inertial measurement units and depth sensors are being combined with adaptive control algorithms so that complex missions can be undertaken with minimal human supervision.
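A minimal sketch of that feedback idea: a depth-hold loop that fuses a noisy depth reading with an IMU pitch estimate and adapts its integral gain on the fly. The gains, toy dynamics, and adaptation rule are all invented for illustration, not taken from any particular vehicle:

```python
import numpy as np

def run_depth_hold(target=10.0, steps=600, dt=0.05, seed=0):
    """Toy adaptive depth-hold loop: depth sensor + IMU pitch in, thrust out.

    A PI controller drives a vertical thruster toward the target depth;
    the integral gain is de-weighted when the IMU reports a large pitch,
    a crude stand-in for the adaptive logic described above. The
    one-dimensional "vehicle" is invented so the loop has something to act on.
    """
    rng = np.random.default_rng(seed)
    kp, ki_base = 2.0, 0.4
    depth, vel, integral = 0.0, 0.0, 0.0
    for t in range(steps):
        # Sensing: noisy depth reading plus a simulated IMU pitch estimate.
        measured = depth + rng.normal(0.0, 0.02)
        pitch = 0.3 * np.sin(0.1 * t) + rng.normal(0.0, 0.01)  # radians

        # Adaptation: trust the integrator less while the body is pitched over.
        ki = ki_base / (1.0 + 5.0 * abs(pitch))
        error = target - measured
        integral += error * dt
        thrust = kp * error + ki * integral

        # Toy vertical dynamics: thrust, linear drag, and a slow current.
        accel = thrust - 0.8 * vel + 0.5 * np.sin(0.05 * t)
        vel += accel * dt
        depth += vel * dt
    return depth

print(f"settled at {run_depth_hold():.2f} m (target 10.0 m)")
```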
Earlier this year, a team from the University of Bristol designed a simple yet smart robot that uses fluid flows of air or water to coordinate suction and movement, much as octopuses do. The study, published in Science Robotics, showed how the soft robot can stick to objects, sense its environment, and control its own actions accordingly.
The suction intelligence works on two levels. By coupling suction flow with local fluidic circuitry, the robot achieves octopus-like low-level embodied intelligence: gently grasping delicate objects, curling adaptively, and encapsulating objects of unknown geometry. And by decoding the pressure response of a suction cup, it achieves high-level perception: detecting contact, classifying the environment and surface roughness, and predicting interactive pulling force.
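The Bristol team's actual decoding pipeline isn't reproduced here, but the flavor of pressure-based perception is easy to illustrate. In the hypothetical sketch below, a sustained vacuum signals contact, and the rate at which a sealed cup leaks back toward ambient pressure serves as a crude proxy for surface roughness; all thresholds and features are invented:

```python
import numpy as np

def decode_suction_trace(pressure, p_ambient=101.3, seal_threshold=20.0):
    """Toy decoder for a suction-cup pressure trace (kPa over time).

    The features and thresholds are invented for illustration, not taken
    from the Bristol paper: a sustained vacuum means the cup has sealed
    against something (contact), and a sealed cup that slowly leaks back
    toward ambient pressure suggests a rough, poorly sealing surface.
    """
    trace = np.asarray(pressure, dtype=float)
    vacuum = p_ambient - trace  # positive while the cup holds a seal
    if vacuum[-1] < seal_threshold:
        return {"contact": False, "surface": None}
    # Leak rate: average pressure recovery per sample over the trace tail.
    leak_rate = float(np.mean(np.diff(trace[len(trace) // 2:])))
    surface = "rough" if leak_rate > 0.05 else "smooth"
    return {"contact": True, "surface": surface, "leak_rate": leak_rate}

# Synthetic traces: the pump pulls a vacuum for 50 samples; a good seal
# then holds it, while a leaky (rough-surface) seal decays toward ambient.
t = np.arange(100)
pumping = 101.3 - 40.0 * (1.0 - np.exp(-t / 5.0))
smooth = np.where(t < 50, pumping, pumping[49])
rough = np.where(t < 50, pumping, 101.3 - 40.0 * np.exp(-(t - 50) / 100.0))
for name, trace in [("smooth", smooth), ("rough", rough)]:
    print(name, decode_suction_trace(trace))
```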
The researchers are now working on making the system smaller and more robust for real-world use. They also aim to combine it with smart materials and AI to improve its adaptability and decision-making in complex environments.