Hardware elementary perceptron based on polyaniline memristive devices

V.A. Demin a,b, V.V. Erokhin a,c, A.V. Emelyanov a, S. Battistoni c, G. Baldi c, S. Iannotta c, P.K. Kashkarov a,b,d,e, M.V. Kovalchuk a,b,d,e

a National Research Centre "Kurchatov Institute", 123182 Moscow, Russia
b Moscow Institute of Physics and Technology (State University), 141700 Dolgoprudny, Moscow Region, Russia
c CNR-IMEM (National Research Council, Institute of Materials for Electronics and Magnetism) and University of Parma, Viale Usberti 7A, 42124 Parma, Italy
d Physics Department, Lomonosov Moscow State University, GSP-1, Leninskie Gory, 119991 Moscow, Russia
e Physics Department, Saint Petersburg State University, Saint Petersburg 199034, Russia

Article history: Received 21 April 2015; Received in revised form 6 June 2015; Accepted 8 June 2015; Available online 9 June 2015

Keywords: Memristor; Perceptron; Pattern classification; Machine learning; Polyaniline; Neuromorphic computing

Abstract

An elementary perceptron is an artificial neural network with a single layer of adaptive links and one output neuron that can solve simple linearly separable tasks such as invariant pattern recognition, linear approximation, prediction and others. We report on a hardware realization of the elementary perceptron that uses polyaniline-based memristive devices as the analog link weights. An error-correction algorithm was used to train the perceptron to implement the NAND and NOR logic functions as examples of linearly separable tasks. The physical realization of an elementary perceptron demonstrates the feasibility of building hardware neuromorphic networks from organic memristive devices. The results hold great promise for new approaches to very compact, low-volatile and high-performance neurochips that could be used in a large number of intelligent products and applications.

© 2015 Elsevier B.V. All rights reserved.

1. Introduction

A perceptron is an artificial neural network capable of supervised learning in a variety of tasks, among which are pattern recognition and classification, approximation, prediction, and others. In many cases these tasks are mathematically ill-posed problems (due to incomplete or distorted input data) whose solution allows one to navigate in a real environment. They are solved relatively easily by humans but are quite difficult for conventional computers. The perceptron was developed by Rosenblatt in 1957 [1] as a model of brain perception. Historically, the single-layer perceptron consisted of three layers of neurons: sensory, associative, and responsive. Only the links from the associative to the responsive neurons could be trained (varied during network learning), while the first layer of weights, from the sensory to the associative neurons, was set randomly or deterministically and remained unchanged in the learning process. Later on, the single-layer perceptron (e.g., in the terminology of Wasserman [2]) came to mean a network in which each sensory neuron is directly connected to a single neuron of the associative layer. The need to assign a separate set of sensory neurons thus disappeared, and only the layer of variable links between the associative and responsive neurons was retained.
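The following minimal software sketch illustrates this single-trainable-layer scheme and the error-correction training mentioned in the abstract. It is only an illustrative analogue, not the hardware described in this work: here the weights are ordinary floating-point numbers, whereas in the reported device they are the analog conductances of polyaniline memristive links; the learning rate, epoch count and zero initial weights are arbitrary assumptions.

```python
# Illustrative software sketch (not the paper's hardware): an elementary
# perceptron with one output neuron, trained with the error-correction
# (Rosenblatt) rule on the NAND and NOR truth tables. Learning rate,
# epoch count and zero initial weights are assumptions for the example.

def step(x):
    # Hard-threshold activation of the responsive (output) neuron
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    # Two input weights plus a bias weight, all trainable
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + w[2])  # bias input fixed at 1
            err = target - y                        # error-correction signal
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            w[2] += lr * err
    return w

def predict(w, x1, x2):
    return step(w[0] * x1 + w[1] * x2 + w[2])

# NAND and NOR truth tables, both linearly separable
nand = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
nor  = [((0, 0), 1), ((0, 1), 0), ((1, 0), 0), ((1, 1), 0)]

for name, table in (("NAND", nand), ("NOR", nor)):
    w = train_perceptron(table)
    print(name, [predict(w, x1, x2) for (x1, x2), _ in table])
```

Both truth tables are linearly separable, i.e. a single line in the input plane divides the two output classes, which is why one trainable layer and one output neuron suffice; a non-linearly separable mapping cannot be represented by such a network, a limitation discussed next.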
Note that the classic single-layer Rosenblatt perceptron is not functionally equivalent to the single-layer Wasserman perceptron, precisely because of the presence of the branch points (sensory neurons) for the input signals. For example, a single-layer perceptron in Rosenblatt's sense is able to solve so-called non-linearly separable problems, whereas one in Wasserman's sense is not. Because of this terminological confusion, and also due to the critical book of Minsky and Papert [3], there were some misconceptions concerning the limitations of the classical perceptron. As a result, research in this area was paused for almost 15 years. Nevertheless, a number of mathematical developments, especially those concerning the learning algorithms of the multilayered perceptron (with several layers of associative neurons) [4,5], revived the interest in artificial neural networks (ANN), which is still growing. In recent years, the development and fabrication of a nanostructured solid-state analog of the synapse, the so-called memristor, has become a new factor of growing research intensity in the field of