A machine that connects the unconscious mind to the vastly expanding world of big data could help humans navigate the explosion in data generated by a boom in computing technology.
Every minute, the world generates 1.7 million billion bytes of data – the equivalent of 360 000 DVDs – from sources such as sensors for climate information, satellite imagery, purchase transaction records, GPS signals and simulation tools.
For example, the Large Hadron Collider – which famously discovered the elusive Higgs particle – generates data from 600 million particle collisions per second. Meanwhile, the NASA Center for Climate Simulation stores 37 petabytes of data – 37 quadrillion bytes, or 37 times as much as the data contained in US academic libraries.
As a result, the data sector is growing by 40 % every year, with important implications for storage and accessibility of the information. It also raises the question of how humans can make the most of all of this rich data without being overwhelmed.
The answer could come via two EU-funded projects – CEEDs and VELaSSCo – which have brought together teams of leading experts from key disciplines to answer this question.
Jonathan Freeman, Professor of Psychology at Goldsmiths, University of London, leads the four-year CEEDs project, which has created an ‘eXperience Induction Machine’ (XIM). The XIM helps humans navigate vast amounts of complex information, using virtual reality and wearable sensors that track brain waves, heart rate and other responses.
‘It turns out that only a small subset of sensory input reaches conscious awareness, yet the remainder is still processed by the brain,’ said Prof. Freeman. ‘This subconscious processing is very good at detecting novel patterns and meaningful signals. By unlocking the power of the subconscious, CEEDs will make fundamental contributions to human experience.’
The XIM machine, which was developed in the lab of CEEDs’ scientific director Professor Paul Verschure in Barcelona, consists of an immersive room equipped with speakers, projectors, projection screens, pressure-sensitive floor tiles, infrared cameras and a microphone. Data visualisations are displayed on the screen and the person’s response is monitored through sensors embedded in a headset.
‘The system acknowledges when participants are getting fatigued or overloaded with information,’ said Prof. Freeman. ‘And it adapts accordingly: it either simplifies the visualisations to reduce the cognitive load, keeping the user less stressed and better able to focus, or it guides the person to areas of the data representation that are less heavy in information.’
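As a rough illustration of the adaptation loop Prof. Freeman describes, the sketch below combines physiological signals into a load score and then either simplifies the view or redirects the user to a sparser region. Every signal name, weight and threshold here is invented for the example; none of it is part of the actual CEEDs system.

```python
# Hypothetical sketch of the adaptive loop described above: monitor
# physiological signals and either simplify the view or steer the user
# towards a sparser region of the data. All names, weights and
# thresholds are invented for illustration, not CEEDs internals.

def cognitive_load(heart_rate, brainwave_beta, pupil_dilation):
    """Combine normalised signals (each 0-1) into a rough load score."""
    return 0.4 * heart_rate + 0.4 * brainwave_beta + 0.2 * pupil_dilation

def adapt_view(load, detail_level, regions):
    """Return a (detail_level, redirect_target) pair for the next frame."""
    if load > 0.8:
        # Overloaded: drop one level of visual detail, stay put.
        return max(detail_level - 1, 1), None
    if load > 0.6:
        # Fatigued: keep detail, but redirect to the least dense region.
        return detail_level, min(regions, key=lambda r: r["density"])
    return detail_level, None  # comfortable: no change needed

regions = [{"name": "core", "density": 0.9}, {"name": "edge", "density": 0.2}]
level, target = adapt_view(cognitive_load(0.9, 0.8, 0.7), 3, regions)
```

With the high readings above, the load score works out to 0.82, so the sketch drops one detail level; at a middling load it would instead redirect to the low-density "edge" region.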
For example, CEEDs technology could be used to guide a school child around datasets of interest – say of planets and stars – using virtual reality visualisations on screens. Importantly, the child’s unconscious processes would be monitored using sensors, detecting when they are bored or excited, and tailoring the exploration accordingly.
Other possible applications for CEEDs include satellite imagery inspection, oil prospecting, economics and historical research. The technology has already been used for two years at the Bergen-Belsen memorial site in Germany. Discussions are also ongoing with museums in the Netherlands, the UK and the United States ahead of the 2015 commemorations of the end of World War II.
Prof. Freeman said the idea could be expanded to areas such as retail. ‘Imagine an online shoe store which has thousands of shoes. How can users make quicker and more objective choices? CEEDs offers quick feedback on what people like and don’t like, based on brainwave response, eye movement and arousal. These implicit reactions can inform choice.’
He said the CEEDs system could also be used for therapeutic purposes. ‘There is scope for real-time therapy, where physiological measures, such as heart rate, arousal and gaze, are measured during sessions with an empathic counsellor.’
Meanwhile, the VELaSSCo project is aiming to make big data more understandable and useful for scientists by converting simulation data into simple visualisations for use in research and industry.
Simulations help to imitate real-world processes – for example simulation of planes is used in the aerospace industry to improve aerodynamics. Similarly, they can be used to predict weather and pollution across cities, model the aerodynamic behaviour of a Formula One car or investigate biological processes.
Such models rely on complex datasets generated by simulation tools. As technology expands, this data is increasing at an exponential rate, running into billions of separate records. Data has to be computed and stored on multiple servers, raising issues about how to access and manipulate the information.
‘Consider a simulation run on a Formula One car,’ said Abel Coll, VELaSSCo project coordinator at the International Centre for Numerical Methods in Engineering (CIMNE) in Barcelona, Spain. ‘There may be simulation results relating to the air pressure and velocities around certain parts of the car; these results come from physics – these are the big data.
‘Now we have the big data, we may want to visualise them using contour lines or different colours to see the pressure of the air which is in contact with the F1 car. The problem is how to store and access the data – how do we manage this big data when it is distributed in different machines?’
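The distributed problem Coll describes can be sketched in miniature: if each machine summarises its own shard of pressure samples into per-band counts, only those small summaries need to cross the network to drive a contour-style colour map. The shard layout and pressure band edges below are invented for the example; this is not the VELaSSCo platform itself.

```python
# Minimal sketch of summarising distributed pressure data for visualisation.
# Each "shard" stands in for data held on a separate machine; the shard
# contents and band edges are invented for this example.
from bisect import bisect_right
from collections import Counter

BANDS = [95_000, 100_000, 105_000]  # pressure band edges in Pa

def band_of(pressure_pa):
    """Index of the colour band a pressure value falls into."""
    return bisect_right(BANDS, pressure_pa)

def summarise_shard(samples):
    """Run locally on each machine: count its samples per band."""
    return Counter(band_of(p) for p in samples)

# Two shards standing in for data stored on two different machines.
shards = [
    [94_500, 96_200, 101_300],   # machine A
    [99_800, 104_100, 106_000],  # machine B
]

# Only the small per-band counts cross the network, not the raw samples.
total = sum((summarise_shard(s) for s in shards), Counter())
```

The visualisation layer then colours each band from the merged counts, without ever pulling the full simulation results onto one machine.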
The VELaSSCo team, which includes experts in big data handling, advanced visualisation, engineering simulations and consultants from industrial sectors such as aerospace, is hoping to create a ‘simulation data analysis platform’ for the scientific and engineering community.
The platform will store, access and transform data to enable researchers to visualise the most detailed and up-to-date simulation results. It will include a database engine based on widely used technologies that organise and store a diverse range of large-scale simulation datasets for collaborative use.
‘We want to work out how best to take advantage of the big data to provide further visualisations of the simulation results in a user-friendly way,’ said Coll. ‘This is where the worlds of big data and visualisations converge.’