A machine that connects the unconscious mind to the vastly expanding world of big data could help humans navigate the explosion in data generated by a boom in computing technology.
Every minute, the world generates 1.7 million billion bytes of data – the equivalent of 360 000 DVDs – from sources such as sensors for climate information, satellite imagery, purchase transaction records, GPS signals and simulation tools.
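The DVD comparison can be checked with a quick calculation, assuming a standard 4.7 GB single-layer DVD:

```python
# Sanity-check the DVD equivalence for 1.7 million billion bytes per minute.
bytes_per_minute = 1.7e15   # 1.7 petabytes generated every minute
dvd_capacity = 4.7e9        # single-layer DVD, 4.7 GB

dvds_per_minute = bytes_per_minute / dvd_capacity
print(round(dvds_per_minute))  # roughly 360 000 DVDs' worth of data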
For example, the Large Hadron Collider – which famously discovered the elusive Higgs particle – generates data from 600 million particle collisions per second. Meanwhile, the NASA Center for Climate Simulation stores 37 petabytes of data – 37 quadrillion bytes, or 37 times as much data as is held in US academic libraries.
As a result, the data sector is growing by 40% every year, with important implications for the storage and accessibility of information. It also raises the question of how humans can make the most of all this rich data without being overwhelmed.
The answer could come via two EU-funded projects – CEEDs and VELaSSCo – which have brought together teams of leading experts from key disciplines to answer this question.
Jonathan Freeman, Professor of Psychology at Goldsmiths, University of London, leads the four-year CEEDs project, which has created an ‘eXperience Induction Machine’ (XIM). The XIM helps humans navigate vast amounts of complex information, using virtual reality and wearable sensors that track brain waves, heart rate and other responses.
‘It turns out that only a small subset of sensory input reaches conscious awareness, yet the remainder is still processed by the brain,’ said Prof. Freeman. ‘This subconscious processing is very good at detecting novel patterns and meaningful signals. By unlocking the power of the subconscious, CEEDs will make fundamental contributions to human experience.’
The XIM, developed in the lab of CEEDs’ scientific director Professor Paul Verschure in Barcelona, consists of an immersive room equipped with speakers, projectors, projection screens, pressure-sensitive floor tiles, infrared cameras and a microphone. Data visualisations are displayed on the screens, and the person’s responses are monitored through sensors embedded in a headset.
‘The system acknowledges when participants are getting fatigued or overloaded with information,’ said Prof. Freeman. ‘And it adapts accordingly. It either simplifies the visualisations so as to reduce the cognitive load, thus keeping the user less stressed and more able to focus. Or it will guide the person to areas of the data representation that are not as heavy in information.’
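The adaptive loop Prof. Freeman describes can be sketched as a simple control loop. The signals, thresholds and weightings below are illustrative assumptions, not the actual CEEDs implementation:

```python
# Illustrative sketch of an adaptive visualisation loop (not the CEEDs code):
# estimate cognitive load from physiological signals, then adapt the display.

def estimate_load(heart_rate: float, brainwave_beta: float) -> float:
    """Combine (hypothetical) normalised signals into a 0-1 load score."""
    return 0.5 * min(heart_rate / 120.0, 1.0) + 0.5 * min(brainwave_beta, 1.0)

def adapt_display(load: float, detail_level: int) -> int:
    """Simplify the visualisation when load is high, enrich it when low."""
    if load > 0.7:           # user looks overloaded: reduce detail
        return max(detail_level - 1, 1)
    if load < 0.3:           # user has spare capacity: show more detail
        return detail_level + 1
    return detail_level      # comfortable: leave the view unchanged

detail = 5
for hr, beta in [(130, 0.9), (70, 0.1), (90, 0.5)]:
    detail = adapt_display(estimate_load(hr, beta), detail)
print(detail)
```

The first reading (high heart rate, high beta activity) drives the detail level down; the calmer readings leave it unchanged.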
For example, CEEDs technology could be used to guide a schoolchild around datasets of interest – say, of planets and stars – using virtual reality visualisations on screens. Importantly, the child’s unconscious responses would be monitored through sensors, detecting when they are bored or excited and tailoring the exploration accordingly.
Other possible applications for CEEDs include satellite-imagery inspection, oil prospecting, economics and historical research. The technology has already been in use for two years at the Bergen-Belsen memorial site in Germany, and discussions are ongoing with museums in the Netherlands, the UK and the United States ahead of the 2015 commemorations of the end of World War II.
Prof. Freeman said the idea could be expanded to areas such as retail. ‘Imagine an online shoe store which has thousands of shoes. How can users make quicker and more objective choices? CEEDs offers quick feedback on what people like and don’t like, based on brainwave response, eye movement and arousal. These implicit reactions can inform choice.’
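One way to turn such implicit signals into recommendations is a weighted score per item. The signal names, weights and values below are hypothetical, purely to illustrate the idea:

```python
# Hypothetical ranking of products by implicit physiological response.
# Each product gets normalised (0-1) scores for gaze time, arousal and a
# brainwave-derived engagement measure; the weights are illustrative.

WEIGHTS = {"gaze": 0.4, "arousal": 0.3, "engagement": 0.3}

def implicit_score(signals: dict) -> float:
    """Weighted sum of the implicit signals for one product."""
    return sum(WEIGHTS[k] * signals[k] for k in WEIGHTS)

shoes = {
    "red sneaker":  {"gaze": 0.9, "arousal": 0.7, "engagement": 0.8},
    "black boot":   {"gaze": 0.4, "arousal": 0.3, "engagement": 0.5},
    "brown loafer": {"gaze": 0.6, "arousal": 0.8, "engagement": 0.4},
}

# Rank the catalogue by the user's implicit reactions, strongest first.
ranked = sorted(shoes, key=lambda s: implicit_score(shoes[s]), reverse=True)
print(ranked)
```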
He said the CEEDs system could also be used for therapeutic purposes. ‘There is scope for real-time therapy, where physiological measures, such as heart rate, arousal and gaze, are measured during sessions with an empathic counsellor.’
Meanwhile, the VELaSSCo project is aiming to make big data more understandable and useful for scientists by converting simulation data into simple visualisations for use in research and industry.
Simulations imitate real-world processes – for example, the aerospace industry simulates aircraft to improve aerodynamics. They can also be used to predict weather and pollution across cities, model the aerodynamic behaviour of a Formula One car or investigate biological processes.
Such models rely on complex datasets generated by simulation tools. As technology expands, this data is increasing at an exponential rate, running into billions of separate records. Data has to be computed and stored on multiple servers, raising issues about how to access and manipulate the information.
‘Consider a simulation run on a Formula One car,’ said Abel Coll, VELaSSCo project coordinator at the International Centre for Numerical Methods in Engineering (CIMNE) in Barcelona, Spain. ‘There may be simulation results relating to the air pressure and velocities around certain parts of the car; these results come from physics – these are the big data.
‘Now we have the big data, we may want to visualise them using contour lines or different colours to see the pressure of the air which is in contact with the F1 car. The problem is how to store and access the data – how do we manage this big data when it is distributed in different machines?’
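The distribution problem Coll describes – results scattered across machines – is typically handled by partitioning the records and pushing queries out to each partition. A minimal sketch, assuming simulation results stored as hypothetical (node_id, pressure) records, with plain dicts standing in for remote servers:

```python
# Minimal sketch of querying simulation results spread over several
# "machines" (here, plain dicts standing in for remote servers).

# Hypothetical partitioning: each record is assigned to a server by its
# mesh-node id, so no single machine holds the whole dataset.
servers = [dict() for _ in range(3)]

def store(node_id: int, pressure: float) -> None:
    """Route a (node_id, pressure) record to its partition."""
    servers[node_id % len(servers)][node_id] = pressure

def query_max_pressure() -> float:
    """Each server computes a local maximum; the results are combined."""
    return max(max(s.values()) for s in servers if s)

for node, pressure in [(0, 101.3), (1, 99.8), (2, 104.6), (3, 100.1)]:
    store(node, pressure)
print(query_max_pressure())  # 104.6
```

Real platforms use the same scatter-gather pattern: the query is decomposed, each machine answers over its own partition, and only the small partial results travel over the network.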
The VELaSSCo team, which includes experts in big data handling, advanced visualisation and engineering simulations, as well as consultants from industrial sectors such as aerospace, is hoping to create a ‘simulation data analysis platform’ for the scientific and engineering community.
The platform will store, access and transform data to enable researchers to visualise the most detailed and up-to-date simulation results. It will include a database engine based on widely used technologies that organise and store a diverse range of large-scale simulation datasets for collaborative use.
‘We want to work out how best to take advantage of the big data to provide further visualisations of the simulation results in a user-friendly way,’ said Coll. ‘This is where the worlds of big data and visualisations converge.’