An approach to reservoir computing design and training

Aida A. Ferreira a,b,*, Teresa B. Ludermir a, Ronaldo R.B. de Aquino a

a Federal University of Pernambuco (UFPE), P.O. Box 7851, Cidade Universitária, CEP 50.740-530, Recife, PE, Brazil
b Federal Institute of Science, Technology and Education of Pernambuco, Av. Professor Luis Freire, 500, Cidade Universitária, CEP 50.740-530, Recife, PE, Brazil

Keywords: Reservoir computing; Echo state networks; Evolutionary algorithm

Abstract

Reservoir computing is a framework for computation, like a recurrent neural network, that allows for the black-box modeling of dynamical systems. In contrast to other recurrent neural network approaches, reservoir computing does not train the input and internal weights of the network; only the readout is trained. However, it is still necessary to adjust parameters to create a "good" reservoir for a given application. In this study we introduce a method called RCDESIGN (reservoir computing and design training). RCDESIGN combines an evolutionary algorithm with reservoir computing and simultaneously searches for the best values of the parameters, the topology and the weight matrices, without rescaling the reservoir matrix by its spectral radius. The idea of adjusting the spectral radius to lie within the unit circle in the complex plane comes from linear system theory; this argument, however, does not necessarily apply to nonlinear systems, which is the case of reservoir computing. The results obtained with the proposed method are compared with those obtained by a genetic algorithm search for the global parameters of reservoir computing. Four time series were used to validate RCDESIGN.

© 2013 Elsevier Ltd. All rights reserved.

1. Introduction

Theoretically, recurrent neural networks (RNNs) are very powerful tools for solving complex temporal machine learning tasks. Nonetheless, several factors still hinder the larger-scale deployment of RNNs in practical applications.
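The spectral-radius rescaling that RCDESIGN dispenses with can be sketched as follows. This is a minimal illustration in Python with NumPy; the reservoir size and the target radius of 0.9 are arbitrary illustrative choices, not values from this paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100  # reservoir size (illustrative)

# Random reservoir weight matrix with entries in [-1, 1].
W = rng.uniform(-1.0, 1.0, size=(n, n))

# Conventional ESN practice: rescale W so its spectral radius
# (largest eigenvalue magnitude) falls below 1.
rho = max(abs(np.linalg.eigvals(W)))
W_scaled = W * (0.9 / rho)

print(round(max(abs(np.linalg.eigvals(W_scaled))), 4))  # -> 0.9
```

Because eigenvalues scale linearly with the matrix, the rescaled reservoir has spectral radius exactly 0.9; the linear-systems intuition behind this recipe is precisely what the abstract argues need not hold for nonlinear reservoirs.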
There are few learning rules for RNNs, and most suffer from slow convergence rates, thus limiting their applicability (Verstraeten, Schrauwen, D'Haene, & Stroobandt, 2007). In 2001, a new approach to RNN design and training was proposed independently under the name of Liquid State Machines (Maass, Natschlager, & Markram, 2002) and under the name of Echo State Networks (Jaeger, 2001). This approach, which had predecessors in computational neuroscience (Dominey, 1995) and subsequent ramifications in machine learning such as the Backpropagation–Decorrelation learning rule (Steil, 2004), is now often referred to as reservoir computing (RC) (Lukosevicius & Jaeger, 2009). The basic concept is to randomly construct an RNN and leave its weights unchanged. A separate readout is then trained on the reservoir's response to the input signals using linear regression. The underlying idea is that a randomly constructed reservoir offers a complex nonlinear dynamic transformation of the input signals, which allows the readout to extract the desired output using a simple linear mapping. RC offers an intuitive methodology for using the temporal processing power of RNNs without the hassle of training them (Schrauwen, Defour, Verstraeten, & Campenhout, 2007b).

This study proposes an approach to reservoir computing design and training using an evolutionary strategy. Although RC optimization is a challenge, checking the performance of a candidate system is relatively inexpensive. This makes evolutionary methods for reservoir pre-training a natural strategy for searching for the best model for any task (Lukosevicius & Jaeger, 2009).
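The basic RC recipe described above can be sketched in a few lines. This is a minimal echo state network in Python with NumPy, using a toy next-step sine-prediction task; the sizes, scaling constants and task are illustrative assumptions, not this paper's experimental setup, and the 0.9 spectral-radius rescaling follows the conventional recipe that RCDESIGN deliberately avoids.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, washout = 1, 50, 50

# Fixed random input and reservoir weights: never trained.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # conventional rescaling

# Toy task: predict the next value of a sine wave.
u = np.sin(np.arange(500) * 0.1).reshape(-1, 1)
y = np.roll(u, -1, axis=0)

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = []
for t in range(len(u)):
    x = np.tanh(W_in @ u[t] + W @ x)
    states.append(x.copy())
X = np.array(states)[washout:-1]  # drop transient and wrapped sample
Y = y[washout:-1]

# Train only the linear readout, via least squares.
W_out, *_ = np.linalg.lstsq(X, Y, rcond=None)
mse = float(np.mean((X @ W_out - Y) ** 2))
print(mse < 0.05)
```

Note that the for-loop and the `lstsq` call are the entire "training": the nonlinear recurrent part is generated once at random and only read out linearly, which is what makes evaluating a candidate reservoir so cheap.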
Several evolutionary approaches to ESN reservoir optimization have been presented (Bush & Tsendjav, 2005; Ishii, van der Zant, Becanovic, & Ploger, 2004). However, they separated the topology from the reservoir weights in order to reduce the search space, and they also constrained the search to spectral radii smaller than 1 in order to guarantee the echo state property. The evolutionary strategy adopted in this work simultaneously searches for the best values of the reservoir global parameters, the best topology and the reservoir matrix, without the search-space reduction imposed in previous works and without rescaling the matrices by the spectral radius. All experiments were implemented in MATLAB with the RCToolbox (Schrauwen, Verstraeten, & Haene, 2007a). The genetic vector used in the proposed evolutionary algorithm is therefore larger than those of other evolutionary approaches to ESN reservoir optimization.

This study is organized as follows. In Section 2 we provide an overview of reservoir computing. The echo state property is explained in Section 3. Section 4 presents the motivation and the

Expert Systems with Applications 40 (2013) 4172–4182
http://dx.doi.org/10.1016/j.eswa.2013.01.029
0957-4174/$ - see front matter © 2013 Elsevier Ltd. All rights reserved.
* Corresponding author at: Federal University of Pernambuco (UFPE), P.O. Box 7851, Cidade Universitária, CEP 50.740-530, Recife, PE, Brazil. Tel.: +55 81 21268986.
E-mail addresses: aidaaf@gmail.com (A.A. Ferreira), tbl@cin.ufpe.br (T.B. Ludermir), rrba@ufpe.br (R.R.B. de Aquino).