49th International Symposium ELMAR-2007, 12-14 September 2007, Zadar, Croatia

Adaptive Polynomial Neural Networks for Time Series Forecasting

Panos Liatsis 1, Amalia Foka 2, John Yannis Goulermas 3, Lidija Mandic 4
1 School of Engineering and Mathematical Sciences, City University, Northampton Square, London EC1V 0HB, UK
2 Department of Computer Science, University of Ioannina, PO Box 1186, 45110 Ioannina, Greece
3 Department of Electrical Engineering and Electronics, University of Liverpool, Brownlow Hill, Liverpool L69 3GJ, UK
4 University of Zagreb, Faculty of Graphic Arts, Getaldiceva 2, Zagreb, Croatia
E-mail: p.liatsis@city.ac.uk

Abstract - Time series prediction involves the determination of an appropriate model, which can encapsulate the dynamics of the system described by the sample data. Previous work has demonstrated the potential of neural networks in predicting the behaviour of complex, non-linear systems. In particular, the class of polynomial neural networks has been shown to possess universal approximation properties, while ensuring robustness to noise and missing data, good generalisation and rapid learning. In this work, a polynomial neural network is proposed, whose structure and weight values are determined with the use of evolutionary computing. The resulting networks allow an insight into the relationships underlying the input data, hence allowing a qualitative analysis of the models' performance. The approach is tested on a variety of non-linear time series data.

Keywords - Genetic Algorithms, Polynomial Neural Networks, Time Series, Forecasting

1. INTRODUCTION

Neural networks (NNs) attempt to simulate the structure and functionality of the human brain. They are massively parallel networks of layers of simple interconnected units, called neurons.
Neural networks are used in a variety of application areas because they are able to model complex non-linear mappings by adapting their parameters (i.e., topology and/or weights), while demonstrating fault tolerance. A popular class of NNs is that of feedforward networks, where information flows from the input to the output layers. Within the class of feedforward networks, there is a categorisation between first-order and higher-order (or polynomial) neural networks. In essence, first-order neural networks process weighted sums of the input data, while polynomial neural networks use higher-order combinations or functions of the data, hence providing suitable non-linear expansions of the representation space of the problem.

Evolutionary computing (EC) is based on processes observed in natural evolution. EC methods follow the Darwinian principle of the survival of the fittest, which provides a robust search and optimisation strategy. Such an approach can be applied to problems where heuristic solutions are unavailable or simply lead to unsatisfactory results. It operates on a population of individuals, which are initially randomly selected. The individuals of a population represent the potential solutions to a particular problem. The initial population evolves towards successively better solutions through the processes of reproduction, crossover and mutation. The fitness value of an individual gives a measure of its performance on the given problem.

In this work, evolving polynomial neural networks (EPNNs) are applied to the time series prediction problem. Traditional approaches to time series prediction are based either on finding the law underlying the actual physical process or on discovering some strong empirical regularities in the observation of the time series.
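The evolutionary cycle described above (random initial population, fitness evaluation, reproduction of the fittest, crossover, mutation) can be illustrated with a toy real-coded genetic algorithm that evolves the coefficients of a quadratic one-step-ahead predictor. This is only a sketch under assumed parameter choices (population size, elite fraction, mutation scale), not the EPNN proposed in this paper:

```python
import random

# Toy sketch: a real-coded GA evolves coefficients (w0, w1, w2) of the
# one-step-ahead polynomial predictor  x[t+1] ~ w0 + w1*x[t] + w2*x[t]**2.
# All parameter values below are illustrative assumptions.

random.seed(0)

def logistic_map(n, x0=0.3, r=3.8):
    """Chaotic benchmark series x[t+1] = r * x[t] * (1 - x[t])."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

series = logistic_map(200)

def fitness(ind):
    # Negative mean squared one-step prediction error (higher is fitter).
    w0, w1, w2 = ind
    sq = sum((series[t + 1] - (w0 + w1 * series[t] + w2 * series[t] ** 2)) ** 2
             for t in range(len(series) - 1))
    return -sq / (len(series) - 1)

def crossover(a, b):
    # Uniform crossover: each coefficient is taken from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, sigma=0.1):
    # Gaussian perturbation of every coefficient.
    return [w + random.gauss(0.0, sigma) for w in ind]

# Randomly selected initial population of candidate solutions.
pop = [[random.uniform(-4.0, 4.0) for _ in range(3)] for _ in range(50)]
f0 = max(fitness(ind) for ind in pop)  # best initial fitness, for comparison

for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                   # survival of the fittest
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(40)]

best = max(pop, key=fitness)
# The logistic map is itself quadratic in x[t], so the evolved coefficients
# should drift towards (0, 3.8, -3.8).
```

Because the elite individuals survive unchanged and the fitness function is deterministic, the best fitness in the population can only improve from generation to generation.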
In the first case, if the law can be discovered and analytically described, for instance, by a set of differential equations, then by solving them, we can predict the future evolution of the time series, given that the initial conditions are known. The disadvantage of this approach is that normally only partial information is known about the dynamical process.

In the second case, if the time series consists of components of periodic processes, it is possible to model it by the superposition of sinusoids. In real-world problems, however, regularities such as periodicity are masked by noise, and some phenomena are described by chaotic time series, where the data seem random with no apparent periodicities.

In Section 2, we discuss some types of polynomial neural networks (PNNs), and explain the fundamental differences between traditional neural networks and PNNs. Next, the architecture of the EPNN is introduced; we describe aspects related to representation, fitness evaluation, as well as the basic genetic operators. Section 4 shows the application of the EPNN to the problem of time series forecasting. Finally, we draw the conclusions of this work and suggest avenues for further work.
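Returning to the second, empirical approach above: a purely periodic series can be modelled by projecting it onto sinusoids and extrapolating the fitted superposition. The sketch below assumes the component frequencies are known in advance; the frequencies and amplitudes are illustrative values, not data from this paper:

```python
import math

# Sketch: recover sinusoidal amplitudes by discrete Fourier projection,
# then forecast by evaluating the fitted superposition beyond the window.
# Frequencies and amplitudes are illustrative assumptions.

N = 240
freqs = [3, 7]  # assumed known, in cycles per observation window
series = [2.0 * math.sin(2 * math.pi * 3 * t / N)
          + 0.5 * math.cos(2 * math.pi * 7 * t / N) for t in range(N)]

def project(xs, k):
    # Amplitudes of the sin/cos components at k cycles per window;
    # integer-frequency sinusoids are orthogonal over the window.
    n = len(xs)
    a = sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(xs)) * 2 / n
    b = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(xs)) * 2 / n
    return a, b

model = {k: project(series, k) for k in freqs}

def predict(t):
    # Superposition of the fitted sinusoids; valid for any t, including
    # t >= N, i.e. forecasting beyond the observed data.
    return sum(a * math.sin(2 * math.pi * k * t / N)
               + b * math.cos(2 * math.pi * k * t / N)
               for k, (a, b) in model.items())
```

As the text notes, this breaks down once noise masks the periodicity or the series is chaotic, which is precisely the setting that motivates the EPNN approach.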