The Quantum Computing Fact Sheet
Or: the 10 (+4) things you cannot get wrong about quantum computers
Research in quantum computing is coming out of the labs. Because of its commercial and national-security significance, the advent of functioning quantum computers is the subject of intense news coverage.
As quantum computing becomes part of the public discourse and its concepts are communicated to wider audiences by news outlets, think tanks and policymakers, it is important to maintain a minimal level of factual correctness in describing its functioning and potential. Inflating the current state of progress or giving erroneous characterisations of its basic principles hinders both the scientific and commercial progress of quantum computation.
Key concepts in quantum computing can have subtle definitions that are difficult to convey using simple metaphors or without employing lengthy explanations. It is partially the duty of the quantum computing research community to come up with better ways to popularise the field.
Until better ways to communicate quantum mechanics are developed, we should strive to popularise quantum computation in "damage control" mode. The purpose of this fact sheet is to provide a list of facts that one cannot get wrong when talking or writing about quantum computers. Complying with the information contained in this list would help ensure a minimal level of factual correctness.
The list below mainly focuses on concepts and research topics that have been often popularised in an erroneous way. The list should by no means be considered an introduction to quantum computing.
We believe it is important that the authors of this web page remain accountable. Andrea Rocchetto (UT Austin) wrote this fact sheet, with input from Scott Aaronson (UT Austin).
- Quantum mechanics is a scientific theory that describes atoms and subatomic particles (like electrons or photons). By exploiting physical effects with no classical analogue, quantum computers can efficiently solve certain computational problems that are challenging for classical computers (present or future, big or small). Some of these effects are superposition, interference, and entanglement.
- A qubit is not "0 and 1 at the same time" but a complex linear combination of 0 and 1. In a complex linear combination, 0 and 1 are associated with two, possibly complex, coefficients known as amplitudes. The squared absolute value of an amplitude gives the probability of observing the corresponding value (0 or 1). Amplitudes can interfere with each other; that is, one amplitude can reinforce or cancel the other. By means of interference one can increase the probability of observing the desired value (0 or 1). The concept of superposition has no good translation into everyday English. More on this (with a tentative everyday English translation) in this blog post by Scott Aaronson.
- A quantum computer cannot be thought of as equivalent to multiple classical computers that share information and execute calculations in parallel. In this spirit, quantum computers do not get their advantage by checking all possible solutions at the same time.
- n qubits are not the same thing as 2^n classical bits — they take 2^n classical bits to specify, but you cannot extract that many by measuring them.
- Quantum entanglement is not a profound new axiom of quantum mechanics but just a logical consequence of the actual axioms.
- When talking about the power of quantum computers in abstract terms, it makes no sense to ask "is a quantum computer a million times faster than a classical computer? a billion times faster?". The relevant question is "is a quantum computer asymptotically faster than a classical computer for a given problem?". We say that a quantum computer has an asymptotic advantage if its advantage over classical computers becomes greater and greater for larger and larger problem sizes, with the ratio of speeds tending to infinity. It is possible that, even for a task where quantum asymptotically beats classical, classical is still better than quantum on small instances of the problem.
- Quantum computers are not faster than classical computers for every type of computational task but only provide an advantage for certain classes of problems. Three prototypical examples are factoring numbers into primes, searching through unstructured databases, and simulating another quantum system (which is relevant for material and drug discovery). More on this, with bonus reflections on the ethical implications of quantum computing, in this essay by Ronald de Wolf.
- When we say that quantum computers are faster than classical computers — as in the examples listed in the previous bullet point — we typically do not have a mathematical proof that a much faster classical algorithm will not be discovered in the future. At best, this is a widely accepted belief. Notable exceptions are cases where the relevant issue is the number of accesses to the input data (like database search).
- Although quantum computers could break much of the encryption currently used to secure the Internet, there are other forms of encryption that are believed not to be breakable even by a quantum computer.
- Presently, quantum computers capable of functioning according to our theoretical models do not exist. Current machines are affected by significant levels of error that we are unable to control.
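The bullet on amplitudes and interference above lends itself to a small numerical sketch. The NumPy snippet below (an illustration of the mathematics, not of how real quantum hardware is programmed) prepares an equal superposition with a Hadamard gate and shows the amplitude of 1 cancelling when the gate is applied twice:

```python
import numpy as np

# A qubit state is a complex linear combination a|0> + b|1>,
# stored as a length-2 vector of amplitudes with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposition = H @ zero            # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(superposition) ** 2  # measurement probabilities (0.5, 0.5)

# Applying H again makes the two paths to |1> interfere destructively:
# the |1> amplitude cancels and the state returns to |0> with certainty.
back = H @ superposition
probs_back = np.abs(back) ** 2      # (1.0, 0.0) up to floating-point error

print(probs, probs_back)
```

The key point is that probabilities come from squared absolute values of amplitudes, and amplitudes (unlike probabilities) can be negative and cancel.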
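The gap between the 2^n numbers needed to describe n qubits and the n bits a measurement yields can also be seen directly in code. A toy sketch (again just the linear algebra, assuming NumPy):

```python
import numpy as np

n = 3  # three qubits

# Describing the state of n qubits takes a vector of 2**n complex
# amplitudes; here, a uniform superposition over all 8 basis strings.
state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)

# But a measurement yields only n classical bits: a single one of the
# 2**n basis strings, sampled with probability |amplitude|^2.
probs = np.abs(state) ** 2
outcome = np.random.choice(2**n, p=probs)
print(format(outcome, f"0{n}b"))  # e.g. "101": just 3 bits come out
```

The description grows exponentially with n, while the measured output stays linear in n; that asymmetry is why n qubits are not "the same as" 2^n classical bits.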
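The point about asymptotic advantage can be made concrete with hypothetical cost models; the functions and constants below are invented purely for illustration and do not correspond to any real machine or algorithm:

```python
# Toy cost models: a "classical" method taking n**2 steps and a
# "quantum" method taking 1000 * n steps (the factor 1000 stands in
# for per-step overhead). The quantum method is asymptotically faster,
# yet loses on small instances.
def classical_cost(n):
    return n ** 2

def quantum_cost(n):
    return 1000 * n

print(classical_cost(100), quantum_cost(100))        # 10000 vs 100000: classical wins
print(classical_cost(10**6), quantum_cost(10**6))    # quantum wins, and the ratio keeps growing
```

Here the ratio classical/quantum is n/1000, which tends to infinity with n: an asymptotic advantage that nonetheless only pays off once the problem is large enough.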
Closer to the news
Machine learning and optimisation
- Using quantum computers to accelerate optimisation and machine learning tasks is an active research area but, as of today, there is no consensus on whether quantum computers can provide more than modest advantages for these problems.
Quantum supremacy
- Read Scott’s Supreme Quantum Supremacy FAQ by Scott Aaronson.
- Quantum supremacy refers to performing, in a reasonable time on a quantum computer, a computational task that cannot be performed in a reasonable time by any known algorithm running on a classical computer (when a quantum computer takes 1 second and a classical computer takes 100 years, we are confidently in the realm of quantum supremacy).
- Tasks that have been proposed for near-future quantum supremacy experiments are highly specific to that goal. At the moment, their only known application is generating cryptographically certified random numbers.