Quantum computing could revolutionise science by offering a new way of solving complex problems beyond the scope of standard transistor-based computers.
Computer scientists, mathematicians and physicists hope that by harnessing the world of quantum mechanics they can make a major step forward in computing power.
The citation for the 2012 Nobel Prize in Physics suggested that quantum computing could transform our lives in this century as much as ordinary computers did in the last.
The aim is to make previously insurmountable problems more manageable. ‘The great attraction is that quantum computers would be able to solve certain problems much faster than classical computers,’ said Prof. Andris Ambainis from the University of Latvia.
‘It’s a radically new way of computing, and it’s the only new way of computing that provides speed-ups by orders of magnitude,’ said Prof. Ambainis, who coordinates the Quantum Computer Science (QCS) consortium, an EU-funded collaboration of eight research teams working on quantum computing problems.
The researchers in the QCS consortium are working to identify the kinds of problems best suited to quantum handling, and how to tackle them. Among their fields of focus are sophisticated methods of quantum cryptography – ways to ensure that secret communication really is safe. The teams are also looking at new methods for quantum algorithms, to ensure the radically different computing technology solves problems in the most efficient way to deliver the massive increases in speed it promises.
Prof. Ambainis said that given the physical sensitivity of many of the processes involved in quantum computing, it is clear that quantum computers would not suit every situation. They may also remain the size of a room, even as their power continues to grow.
‘The hope is that we would add more qubits (quantum bits of information). We would have devices that are of the same size as their current implementations, or perhaps somewhat smaller, but that there would be more qubits and more computational power,’ Prof. Ambainis said.
The next steps forward for the hardware will depend on which of the candidate technologies proves most feasible for building the devices and then scaling them up to the point where their processing power overtakes the number-crunching abilities of classical computers.
Prof. Ambainis believes it could take five to 10 years to develop quantum computers that could simulate quantum mechanical processes, allowing physicists to discover more about the fundamentals that cannot be studied using current approaches.
It could take a couple more decades to reach the stage where quantum computers harness enough qubits to perform other significant mathematical tasks.
Like classical digital computing, quantum computing is a joint effort of hardware and software, but its way of working differs fundamentally.
Ordinary computers process information by switching electric current on (1) or off (0), using semiconductor switches known as transistors to make sequences of binary digits or ‘bits’. The switches are combined to perform operations of basic logic, which are in turn harnessed together to carry out the necessary computation, such as finding the factors of a large number.
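The idea of harnessing simple switches into useful computation can be made concrete with a short sketch. The following Python example (illustrative only, not from the article) builds basic logic gates from bit operations and combines them into a ripple-carry adder, mirroring how transistors are composed into arithmetic circuits:

```python
# Sketch of classical computation: bits (0/1) combined through basic
# logic gates, which in turn are harnessed to perform arithmetic.

def AND(a, b):  # 1 only when both inputs are 1
    return a & b

def XOR(a, b):  # 1 when the inputs differ
    return a ^ b

def half_adder(a, b):
    """Add two bits: XOR gives the sum bit, AND gives the carry."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, XOR(c1, c2)  # at most one carry is 1, so XOR acts as OR here

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists (least significant bit first)."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 3 (binary 011) + 5 (binary 101), least significant bit first:
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. 8
```

Every classical program, however complex, ultimately reduces to layers of such gate combinations.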
In quantum computing, the 0 and 1 are encoded into what is known as a quantum state. Manipulating that quantum state then performs the computation.
1 and 0 simultaneously
An atom can exist in a superposition of different energy states, and a photon of light can have multiple polarisations. These energy states or polarisations can also encode 0 and 1. So, with quantum computers the switch does not change from on to off, but between different quantum states. In addition to 0 and 1, quantum systems can be in a number of intermediate states. That means that a qubit can in fact encode 1 and 0 simultaneously.
When qubits are combined, their states can be superposed, so that all the possible combinations of 0s and 1s are represented at once.
Then, if an operation is performed on one part of the superposition, it affects the whole arrangement, effectively creating a processor whose power grows exponentially with the number of qubits being combined.
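This exponential growth can be seen in a toy simulation. The sketch below (an illustration, not from the article) represents n qubits as a vector of 2**n complex amplitudes and shows that a gate applied to a single qubit acts on the entire state at once:

```python
import numpy as np

# Toy statevector simulator: n qubits require 2**n complex amplitudes,
# which is why the state space grows exponentially with the qubit count.

def zero_state(n):
    """All n qubits in the |00...0> state."""
    psi = np.zeros(2**n, dtype=complex)
    psi[0] = 1.0
    return psi

# Hadamard gate: puts a single qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_one_qubit_gate(psi, gate, target, n):
    """Apply a single-qubit gate; note that it transforms every amplitude."""
    full = np.array([[1.0]])
    for q in range(n):
        full = np.kron(full, gate if q == target else np.eye(2))
    return full @ psi

n = 3
psi = zero_state(n)
for q in range(n):
    psi = apply_one_qubit_gate(psi, H, q, n)

# All 2**3 = 8 basis states are now in equal superposition:
print(np.round(np.abs(psi)**2, 3))  # each probability is 1/8
```

Doubling the qubit count squares the size of the vector a classical machine must store, which is the intuition behind the speed-ups quantum hardware promises.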
Professor Andris Ambainis from the University of Latvia. © Toms Grīnbergs, LU Press center

Physicists around the world are working on several different technologies for developing qubits, and the 2012 Nobel Prize in Physics went to two researchers whose work allows measurement and manipulation of individual photons or ions at the quantum level.
Systems under consideration include lattices of atoms arranged with lasers; the interactions of photons trapped in cavities through which atoms are passed; and nuclear spins manipulated with magnetic fields, among other possibilities.
However, whatever the combination of charged atoms, lasers or magnetic fields, the challenge is in scaling up without destroying essential quantum attributes.
Quantum computers could take several decades to achieve the necessary levels of sophistication to operate as general, universal computers, rather than the specialised quantum systems already on the market. Even then, the approach will not be appropriate for all situations and many tasks will still rely on classical computing, though with incremental improvements.
The technological challenges in building the quantum computers of the future are undeniable, but computer scientists and mathematicians are determined to work out how best to employ them, since their way of working differs so fundamentally from classical methods.
Research on algorithms for quantum computers by members of the QCS consortium is also resulting in new ways of analysing existing technology – such as conventional computing – and how it can be improved. Spin-offs of this process are likely to find their way into the device on your desk or in your hand.