arXiv:2003.09569v2 [quant-ph] 11 Dec 2020

Universal quantum reservoir computing

Sanjib Ghosh,1,∗ Tanjung Krisnanda,1 Tomasz Paterek,1,2,3 and Timothy C. H. Liew1,2,†

1 School of Physical and Mathematical Sciences, Nanyang Technological University, 637371, Singapore
2 MajuLab, International Joint Research Unit UMI 3654, CNRS, Université Côte d'Azur, Sorbonne Université, National University of Singapore, Nanyang Technological University, Singapore
3 Institute of Theoretical Physics and Astrophysics, Faculty of Mathematics, Physics and Informatics, University of Gdańsk, 80-308 Gdańsk, Poland

We show that quantum reservoir neural networks offer an alternative paradigm for universal quantum computing. In this framework, a dynamical random quantum network, called the quantum reservoir, is used as a quantum processor for performing operations on computational qubits. We show various quantum operations, including a universal set of quantum gates, which can be obtained with a single quantum reservoir network with different tunnelling amplitudes between the reservoir and the qubits. The same platform can also implement non-unitary quantum gates, which are useful for simulating open quantum systems with tuneable parameters.

In the era of big data and machine learning, neuromorphic computing is rapidly emerging as an alternative platform for computation and data processing. While conventional computers rely on predetermined algorithms for performing tasks, neuromorphic computers use artificial neural networks, which are flexible and can learn from example in analogy to a biological brain. Their resilience allows them to be versatile in applications and adaptive to practical situations. For instance, neural networks are used across disciplines for a multitude of tasks [1–7], and are capable of extracting features from noisy [8, 9] or incomplete data [10, 11], and perturbed systems [12].
An artificial neural network is a system of interconnected nonlinear nodes capable of modelling complex mappings between input and output data. A given map is formed by carefully adjusting the connection weights between the nodes during a training procedure.
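The reservoir-computing variant of this idea, which the paper carries over to the quantum setting, keeps the internal connections fixed and random and trains only a linear readout. As a rough illustration only (a classical numerical sketch, not the paper's quantum reservoir; the network sizes, tanh nonlinearity, and the toy fitting task are all assumptions for demonstration), the principle can be sketched as:

```python
import numpy as np

# Classical reservoir-computing sketch (illustrative analogue only):
# a fixed random nonlinear network maps inputs into a high-dimensional
# state space, and only the linear readout weights are trained.
rng = np.random.default_rng(0)

n_in, n_res = 1, 50                                # toy sizes (assumption)
W_in = rng.normal(size=(n_res, n_in))              # fixed input weights
W_res = rng.normal(size=(n_res, n_res)) * 0.5 / np.sqrt(n_res)  # fixed recurrent weights

def reservoir_states(inputs):
    """Drive the reservoir with a scalar input sequence; collect node states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        # each node applies a fixed nonlinearity to its weighted inputs
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (assumption): learn y_t = sin(u_t) from a random input sequence.
u = rng.uniform(-np.pi, np.pi, size=500)
y = np.sin(u)

X = reservoir_states(u)                            # (500, 50) reservoir states
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)      # train the readout only

mse = np.mean((X @ W_out - y) ** 2)
print(f"training MSE of linear readout: {mse:.4f}")
```

Because only `W_out` is optimised (here by least squares), training is a simple linear regression, while the fixed random network supplies the nonlinearity; the paper's contribution is to replace this classical network with a random quantum network acting on qubits.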