Will We Ever Have a Quantum Computer?
M.I. Dyakonov
Laboratoire Charles Coulomb, Université Montpellier, France
Abstract. In hypothetical quantum computing, the classical two-state bit is replaced
by a quantum element (qubit) with two basis states, ↑ and ↓. Its arbitrary state is
described by the wave function ψ = a↑ + b↓, where a and b are complex amplitudes
satisfying the normalization condition |a|^2 + |b|^2 = 1. Unlike the classical bit,
which can be in only one of two states, ↑ or ↓, the qubit can be in a continuum of
states defined by the quantum amplitudes a and b. The qubit is a continuous object.
At a given moment, the state of a quantum computer with N qubits is characterized
by 2^N quantum amplitudes, which are continuous variables restricted only by the
normalization condition. Thus, the hypothetical quantum computer is an analog
machine characterized by a super-astronomical number of continuous variables (even
for N ~ 100–1000). Their values cannot be arbitrary; they must be under our control.
Thus the answer to the question in the title is: when physicists and engineers learn
to keep this number of continuous parameters under control, which means never.
Keywords. Quantum computing, qubits
1. Introduction
The idea of quantum computing was first put forward, in a rather vague form, by the
Russian mathematician Yuri Manin in 1980. In 1981 it was independently proposed
(also in a vague form) by Richard Feynman. Realizing that, because of the exponential
increase in the number of quantum states, computer simulations of quantum systems
become impossible when the system is large enough, he advanced the idea that to make
them efficient the computer itself should operate in the quantum mode: “Nature isn’t
classical and if you want to make a simulation of Nature, you’d better make it quantum
mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy”.
In 1985, David Deutsch formally described the universal quantum computer as a
quantum analog of the universal Turing machine.
The subject did not attract much attention until 1994, when Peter Shor proposed an
algorithm that allows an ideal quantum computer to factor very large numbers much
faster than a conventional (classical) computer. This outstanding theoretical result
triggered an explosion of general interest in quantum computing, and many thousands
of research papers, mostly theoretical, have been published and continue to appear
at an increasing rate.
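The exponential growth that motivated Feynman's proposal is easy to quantify: describing an N-qubit state requires 2^N complex amplitudes, so even a modest register outstrips any conceivable classical memory. The following Python sketch (illustrative only, not part of the original paper) tabulates the count, assuming one complex double (16 bytes) per amplitude:

```python
# Illustrative sketch: the number of complex amplitudes describing an
# N-qubit state grows as 2**N, which is why classical simulation of
# large quantum systems becomes infeasible.

def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes in an N-qubit state."""
    return 2 ** n_qubits

def memory_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store the full state vector, assuming one
    complex double (16 bytes) per amplitude."""
    return state_vector_size(n_qubits) * bytes_per_amplitude

for n in (10, 30, 50, 100):
    amps = state_vector_size(n)
    gib = memory_bytes(n) / 2**30
    print(f"N = {n:3d}: 2^N = {amps:.3e} amplitudes, ~{gib:.3e} GiB")
```

Around N = 50 the state vector already exceeds the memory of the largest supercomputers, and at N = 100 the amplitude count is super-astronomical, in line with the estimate in the abstract.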
Laboratoire Charles Coulomb, Université Montpellier, cc 070, 34095 Montpellier, France
michel.dyakonov@gmail.com
Parallel Computing: Technology Trends
I. Foster et al. (Eds.)
© 2020 The authors and IOS Press.
This article is published online with Open Access by IOS Press and distributed under the terms
of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0).
doi:10.3233/APC200019