Dreaming robots can classify the information they have learnt and work out the best way to solve problems they encountered during the day – and it’s helping them to learn more like we do.
A good night’s sleep is about more than just getting rest – while we dream, our unconscious minds process events from the day. That could mean putting an emotional experience into the context of past memories, or perhaps imagining a solution to something which puzzled us during the day.
But what about robots? While they can’t fantasize about flying around the neighbourhood, a group of scientists is working to give them the same ability to make sense of observations during their nightly down-time.
At the EU-funded RobDREAM project in Augsburg, Germany, scientists from industrial robot-maker KUKA and other partners are working to give robots ‘dreaming’ powers. In other words, they plan to use robots’ processing power while they sit idle to sift through all of the data they collect during their work hours.
The team is working with mobile industrial robots that are capable of operating side-by-side with humans. That daily interaction yields a lot of data. Robots store high-definition visual imagery from the systems they use to navigate around the factory floor and handle objects.
They also collect data about their own movements with laser sensors and record how they move a seven-jointed arm, noting the arm’s position and how much force is applied. When all the data is collected, the robots are ready for a snooze.
‘The dreaming aspect itself is that we want to use all this collected data in a simulation environment, where we can simulate the behaviour of the algorithms used in the robots’ systems overnight,’ project coordinator Dr Daniel Braun, from KUKA Roboter GmbH, said.
Such algorithms are rules used by robots to plan how they travel around the factory floor or move their arm to grasp a tool. ‘It’s basically the same as dreaming in humans – you make some experiences during the day … and overnight it’s somehow processed in simulation.’
Working alongside humans means that the robots used in RobDREAM are already capable of making autonomous changes to their daily routines. With processing power roughly three to four times that of a gaming PC, a robot can use an algorithm to avoid a collision if a worker sets down an object that blocks its path. But the solution it comes up with in the moment may not be the most efficient one.
Take a robot arm that can move at seven different joints. ‘The angles you can use are infinite,’ Dr Braun said. ‘The problem is you only get one solution for the generation of a path to take. A colleague of mine did some computations and if you wanted to search (every solution) it would take a few billion years.’
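The ‘billions of years’ figure is easy to sanity-check with back-of-the-envelope arithmetic. The discretisation and evaluation rate below are assumptions chosen for illustration, not figures from the project:

```python
# Illustrative arithmetic: exhaustively enumerating a 7-joint arm's
# configuration space. The angular resolution and evaluation throughput
# are made-up assumptions, not RobDREAM figures.

JOINTS = 7
STEPS_PER_JOINT = 3600        # assume 0.1-degree resolution over 360 degrees
EVALS_PER_SECOND = 1e8        # assume a fast simulator checks 100M poses/sec

configurations = STEPS_PER_JOINT ** JOINTS
seconds = configurations / EVALS_PER_SECOND
years = seconds / (3600 * 24 * 365)

print(f"{configurations:.2e} configurations")   # ~7.84e+24
print(f"{years:.2e} years to enumerate")        # on the order of billions
```

Even with these generous assumptions the enumeration takes around 2.5 billion years, which is why planners settle for one feasible path rather than the provably best one.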
What scientists can do with dreaming is give the robots some suggestions of alternative rules and parameters, which they can then test out overnight. That should lead to better and more efficient paths, or movements, the next day.
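A minimal sketch of that overnight idea: replay a logged scenario in simulation under randomly varied planner parameters and keep the best-scoring set. The single tunable parameter, the cost function, and the numbers here are all invented for illustration – real planners and logs are far richer:

```python
import random

# Toy stand-in for "dreaming": the planner has one tunable parameter
# (a detour margin around an obstacle) and the simulator scores the
# resulting path length. Everything here is a hypothetical example.

def simulate_path_cost(margin: float) -> float:
    """Path length for a detour of `margin` around an obstacle.
    Too small a margin risks collision; too large wastes travel."""
    if margin < 0.2:
        return 100.0                              # collision risk: penalised
    return 10.0 + (margin - 0.5) ** 2 * 8.0       # shortest safe path near 0.5

def overnight_search(trials: int, seed: int = 0) -> tuple[float, float]:
    """Random search over the margin, as an idle robot might run overnight."""
    rng = random.Random(seed)
    best_margin = 0.2
    best_cost = simulate_path_cost(best_margin)
    for _ in range(trials):
        margin = rng.uniform(0.0, 2.0)
        cost = simulate_path_cost(margin)
        if cost < best_cost:
            best_margin, best_cost = margin, cost
    return best_margin, best_cost

margin, cost = overnight_search(trials=5000)
print(f"best margin ~ {margin:.2f}, path cost ~ {cost:.2f}")
```

By morning the robot keeps only the winning parameters, so the next time the same obstacle appears it takes the tighter, safer detour without re-searching.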
The team is looking at using cloud computing to amplify the robots’ processing power and allow multiple robots to share their experiences. It also helps to manage vast amounts of recorded data.
‘When they come to their task the first time, it’s a new situation and they have to cope,’ Dr Braun said. ‘If you have two or three robots who are all encountering the same situation, it’s new for them all the time. When they combine their data, one robot can learn from the experience of another robot.’
Another group of scientists from the UK, France, the Netherlands, and Spain is working on going beyond optimisation, giving robots the ability to learn more quickly and collaboratively through the use of dreaming.
The DREAM project, based in Paris, France, is working with multiple manufacturers’ robots, from Rethink Robotics’ humanoid Baxter with an LCD face, to CrustCrawler’s articulated robotic arm. The scientists’ aim is to come up with a system that allows different robots to sift through their experiences during the day and categorise them in a more abstract way during sleep, enabling them to learn how to interact with their environment on their own.
Unlike the industrial robots at KUKA, these robots begin as a blank slate. The robots use their sensors and effectors – the parts used to interact with objects, like limbs – to observe their environment. They stumble around until they accidentally come upon something interesting, much in the way a toddler might.
‘The idea that we explore from this project is to start from the lowest level possible,’ project coordinator Professor Stéphane Doncieux from the Université Pierre et Marie Curie explained. ‘At the beginning it will be no more than touching an object and noticing that a pen on a table is different from the table, for instance.’
After the robot finds something interesting, it can gather lots of information about its texture, shape, colour, and other properties. Once it has that data it can try to find further ways to interact with objects, like grasping the pen.
‘The goal is to learn it automatically, not to provide the robot with this knowledge,’ Prof. Doncieux said.
That is where dreaming comes in. ‘During the day we acquire a lot of information, and while we sleep we consolidate this knowledge and change the way it is represented in our brain,’ said Prof. Doncieux.
In the EU-funded project, the robots will be able to categorise their experiences in a similar way during their downtime. After exploring a pen, for instance, the robot could identify the data points it learned about that object’s properties, and classify them for future encounters.
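One way to picture that sleep-time categorisation: reduce the day’s observations to feature vectors and cluster them into reusable categories. The features, values, and two-cluster setup below are invented for illustration, not the project’s actual representation:

```python
from statistics import mean

# Toy "sleep-time" pass: daytime observations reduced to feature vectors
# (here [graspability, size] on a 0-1 scale, made-up numbers), grouped by
# a tiny k-means into categories the robot can reuse or share.

observations = {
    "pen":   (0.90, 0.10),
    "cup":   (0.80, 0.20),
    "spoon": (0.85, 0.15),
    "table": (0.10, 0.90),
    "shelf": (0.15, 0.85),
}

def kmeans(points, centroids, iters=10):
    """Assign each object to its nearest centroid, then move centroids."""
    groups = {}
    for _ in range(iters):
        groups = {c: [] for c in range(len(centroids))}
        for name, p in points.items():
            nearest = min(range(len(centroids)),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            groups[nearest].append(name)
        centroids = [tuple(mean(points[n][d] for n in names) for d in range(2))
                     if names else centroids[c]
                     for c, names in groups.items()]
    return groups

clusters = kmeans(observations, centroids=[(1.0, 0.0), (0.0, 1.0)])
print(clusters)   # small graspable things vs. large surfaces
```

The abstract categories, rather than the raw sensor streams, are what could later be exchanged between robots of different kinds.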
Over time, as robots learn about different objects, they create a library of experiences that could even be used to educate other robots.
‘What this kind of project suggests is to have some kind of schools for robots when they are built so they can learn,’ Prof. Doncieux said. ‘If we build a higher representation of what a pen is, for instance, if it is abstract enough, we can exchange it with robots of different kinds.’
Robots that learn things on their own have an advantage in that they form representations and concepts that are adapted to their own abilities, such as their different limbs or ways of sensing.
Both projects have the goal of getting robots out of metal cages to take a more collaborative and mobile role alongside human co-workers. Using dreaming to plan better and understand their world should make robots more efficient as well as safer. Though that may mean reminding human workers that machines are not infallible.
‘What we usually experience is not fear that the robot is dangerous, it is the other way around – that people think that this robot is not dangerous at all,’ Dr Braun said. ‘You still have to make sure that people don’t do crazy things because they think the machine takes care of anything.’