Teams of robotic fish are drawing on the intelligence of swarms of social insects and other organisms in new ways to help protect the environment.
The group cognition of insects such as fireflies and honeybees, and even of organisms such as slime moulds, provides useful models for researchers developing autonomous systems. Teams of underwater vehicles can mimic these natural behaviours to monitor water pollution or search for debris on the seabed.
The EU-funded SHOAL project has developed robots inspired by fish, but operating like ants. These robots analyse the waters they swim through, identifying chemical pollutants or leaks from oil pipelines in European harbours.
As they move around in the water, teams of robots build up a map of their surroundings and work together to patrol the port.
‘These robotic fish allow for constant pollution monitoring, so if an incident occurs, such as a leak or spill in a harbour, we can take action immediately,’ said Luke Speller, a senior research scientist at British-based technology consultancy BMT Group and coordinator of the SHOAL project.
‘Compared to current measurement techniques of divers collecting samples and sending them for laboratory testing, the fish can give a much quicker response to environmental incidents in the port,’ he added.
The SHOAL robots swim using a tailfin rather than a propeller, minimising noise and disruption to marine life. The fish design allows the robots to manoeuvre easily, patrol in shallow waters, and avoid snags that might snarl propellers.
They use sonar to detect obstacles and map their surroundings, and are also equipped with acoustic localisation, gyroscopes, accelerometers and other sensors for navigation. Underwater acoustic communication is also used to share information between robots and the shore. On-board chemical sensors measure pollution and general water quality parameters such as salinity and oxygen concentration.
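Combining inertial dead reckoning with occasional absolute fixes, as the SHOAL robots' sensor suite allows, can be sketched roughly as follows. This is an illustrative toy filter, not the project's actual navigation software; the class, parameter names and the simple blending gain are all assumptions for the example.

```python
import math

class DeadReckoner:
    """Toy navigation estimate: integrates speed and turn rate from the
    inertial sensors (gyroscope, accelerometer), then blends in occasional
    absolute fixes from acoustic localisation. Illustrative only."""

    def __init__(self, x=0.0, y=0.0, heading=0.0, fix_gain=0.5):
        self.x, self.y = x, y
        self.heading = heading      # radians
        self.fix_gain = fix_gain    # how strongly an acoustic fix corrects drift

    def predict(self, speed, turn_rate, dt):
        """Dead-reckoning step from on-board inertial readings."""
        self.heading += turn_rate * dt
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt

    def correct(self, fix_x, fix_y):
        """Pull the drifting estimate towards an acoustic-localisation fix."""
        self.x += self.fix_gain * (fix_x - self.x)
        self.y += self.fix_gain * (fix_y - self.y)

nav = DeadReckoner()
for _ in range(10):                       # swim straight at 1 m/s for 10 s
    nav.predict(speed=1.0, turn_rate=0.0, dt=1.0)
nav.correct(fix_x=9.0, fix_y=0.5)         # acoustic fix reins in the drift
```

Real underwater navigation would use a proper state estimator (for example a Kalman filter), but the same predict/correct rhythm applies.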
Monitoring or search
When monitoring, the robot fish spread out to maximise the coverage area, but patrol all areas regularly. Once a member of the ‘shoal’ detects a possible problem, the system begins a search.
‘Each of the robots is programmed with the same behavioural characteristics. To ensure they act differently, with different goals, each robot shares a small amount of information, such as where it has been and its current readings. If a pollution incident is detected then the robots will switch to a searching behaviour to find and identify the cause and origin of the pollutant,’ Speller said.
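The behaviour Speller describes, identical programs that diverge only through a small amount of shared information, and a collective switch from patrolling to searching, might look something like this sketch. It is not the SHOAL code; the class, the shared-report format and the threshold value are assumptions for illustration.

```python
# Illustrative sketch (not the SHOAL software): every robot runs the same
# behaviour program; differences come only from the small messages each
# one shares -- where it has been and its current readings.
POLLUTION_THRESHOLD = 0.8   # assumed normalised sensor threshold

class FishRobot:
    def __init__(self, name):
        self.name = name
        self.mode = "patrol"
        self.visited = set()

    def step(self, cell, reading, shared_reports):
        """One control cycle: log coverage, share a report, pick a mode."""
        self.visited.add(cell)
        shared_reports.append({"robot": self.name, "cell": cell,
                               "reading": reading})
        # Any robot's high reading switches the whole shoal to searching.
        if any(r["reading"] > POLLUTION_THRESHOLD for r in shared_reports):
            self.mode = "search"    # converge to find the pollutant's origin
        else:
            self.mode = "patrol"    # spread out, favouring unvisited cells
        return self.mode

shoal = [FishRobot(f"fish-{i}") for i in range(3)]
reports = []
modes = [f.step(cell=(i, 0), reading=0.1, shared_reports=reports)
         for i, f in enumerate(shoal)]
# one robot now senses a leak; the shared report flips behaviour:
leak_mode = shoal[0].step(cell=(0, 1), reading=0.95, shared_reports=reports)
```

The shared `visited` sets would, in a fuller version, steer each robot towards cells no teammate has covered recently, which is what keeps the patrol spread out.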
The robotic fish interact with each other and with a base station, which can take direct control if human intervention is required, such as if pollution is expected or an incident has occurred in the port.
The project finished last year, and the researchers are now looking into whether robotic fish could play a role in coral reef monitoring, hydrographic mapping, and even barnacle counting.
Real-world environmental monitoring is also a focus of the EU-funded CoCoRo underwater robotic swarm research. This project is studying the collective cognitive capabilities that emerge from the dynamic interactions of simple individuals.
CoCoRo’s Lily robots are based on refitted turtle-shaped toy submarines; small blinking blue lights are one of their means of communication. Image courtesy of CoCoRo
‘We want to keep our individual robots as simple as possible and design algorithms that can help us to maximise their collective intelligence,’ said project coordinator Dr Thomas Schmickl, from the Artificial Life Laboratory at Karl-Franzens University in Graz, Austria.
‘We want to demonstrate that even such simple individuals can make rather intelligent and complex choices in the group,’ he added.
Key to the CoCoRo project is the potential to scale up from simple individuals to large groups, perhaps hundreds of robots. These could be used for environmental and ecological monitoring, such as plume detection of spills or toxic waste, as well as in exploration or search and rescue.
CoCoRo’s Lily robot platform has achieved the largest autonomous underwater robot swarm yet made, with 22 individuals. Based on refitted turtle-shaped toy submarines, the robots have limited computer-processing power but are cheap and easy to assemble.
The group can apply a set of criteria to pick the best of several targets, and it can estimate its own swarm size without assigning identity numbers, keeping the demands for heavy number crunching low.
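One simple way a swarm can estimate its own size anonymously, sketched below, is for each robot to ping with a fixed probability and for a listener to invert the expected ping count. This is an illustrative approach under assumed parameters, not necessarily the method CoCoRo uses; no robot needs an identity number.

```python
import random

def estimate_swarm_size(true_size, ping_prob=0.2, rounds=200, seed=1):
    """Anonymous swarm-size estimate (illustrative, not CoCoRo's actual
    algorithm): each robot pings with probability `ping_prob` per round;
    a listener counts pings heard and inverts the expectation
    E[pings per round] = ping_prob * (n - 1)."""
    rng = random.Random(seed)
    heard = 0
    for _ in range(rounds):
        # pings arriving from the other true_size - 1 robots this round
        heard += sum(rng.random() < ping_prob for _ in range(true_size - 1))
    return heard / (rounds * ping_prob) + 1   # +1 for the listener itself

print(round(estimate_swarm_size(22)))   # typically close to the true size, 22
```

Because only an aggregate count is needed, the scheme scales to large swarms without any coordination infrastructure, in the spirit of the animal-kingdom mechanisms Dr Schmickl describes below.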
‘It is a totally different approach, to develop mechanisms that are used in the animal kingdom, where there are no identity numbers and there is no Internet Protocol (IP) or similar system for coordinating huge groups of animals. But still they organise themselves very well and make collective decisions and coordinate,’ Dr Schmickl said.
The team is developing the Lily results into a more advanced robot, called Jeff, with a more torpedo-like form, though it is not intended to mimic a fish.
Other EU-funded projects to explore the use of robots in coordinated teams include TRIDENT, led by Spain’s Universitat Jaume I. It developed a system for an underwater robot working in close cooperation with a surface vehicle robot. The set-up guides the robots in formation to survey the sea floor, using a system to find and manipulate or recover items such as ‘black box’ flight recorders.
The CO3-AUVs project also worked on development, implementation and testing of advanced cognitive systems for coordination and cooperative control of multiple autonomous underwater vehicles. The project, coordinated by Jacobs University Bremen, in Germany, focused on systems to explore uncharted territory, as well as monitor underwater structures and carry out harbour safety and security missions.
Grouped together, simple fish robots have the capacity for a surprising level of cognition, and by the time the CoCoRo project ends next year, Dr Schmickl wants the robots to pass the mirror test, a classic test for self-awareness.
In the mirror test, developed in 1970 by US psychologist Gordon Gallup Jr, an animal is surreptitiously marked with dye in a place that would only be visible using a mirror. Investigators then watch to see if it tries to remove the dye, thereby proving that it has recognised itself.
‘The test can show there is an emergent cognition appearing in these systems that does not exist in the individual. It can also show that self-consciousness is not a prerequisite for passing the mirror test,’ Dr Schmickl said.
While a number of robots have been shown to recognise themselves in a mirror, this is the result of sophisticated spatial recognition software rather than a sign they are self-aware, and no robot has ever passed the mirror test.
‘If a very, very simple robot, with a very, very simple programme, could pass that test, this could cast a shadow over the mirror test itself,’ Dr Schmickl said.