ORIGINAL ARTICLE

A novel error-output recurrent neural network model for time series forecasting

Waddah Waheeb¹ · Rozaida Ghazali¹

Received: 6 October 2018 / Accepted: 29 August 2019
© Springer-Verlag London Ltd., part of Springer Nature 2019

Abstract
It is well known that improving forecasting accuracy is an important yet often challenging issue. Extensive research has been conducted using neural networks (NNs) to improve forecasting accuracy. In general, the inputs to NNs are auto-regressive terms (i.e. lagged variables) of one or more time series. In addition, either network outputs or network errors have been used as extra inputs to NNs. In this paper, however, we propose a novel recurrent neural network forecasting model called the ridge polynomial neural network with error-output feedbacks (RPNN-EOF). RPNN-EOF has two main types of inputs: auto-regressive and moving-average inputs. The former are represented by the lagged variables of a time series, while the latter are represented by feeding the network error back to the input layer. In addition, the network output is fed back to the input layer. The proposed recurrent model is able to produce more accurate forecasts owing to its learning of temporal dependence and its direct modelling of the moving-average component. A comparative analysis of RPNN-EOF against five neural network models was carried out on ten time series. Simulation results show that RPNN-EOF is the most accurate of all the compared models on the time series used, which indicates that employing auto-regressive and moving-average inputs together helps to produce more accurate forecasts.

Keywords Recurrent neural network · Error-output feedbacks · Moving-average · NARMA · Ridge polynomial neural network · Forecasting · Nonlinear time series

1 Introduction

A time series is a sequence of observations recorded sequentially over time [10].
Examples of time series data are daily exchange rates, monthly sales, and annual runoff. Forecasting time series data is important because of the role of forecasting in effective and efficient planning [45]. In the field of forecasting, numerous models have been employed for time series forecasting, such as the adaptive neuro-fuzzy inference system, support vector regression, and neural networks [66, 76, 82].

Neural networks (NNs) have been extensively applied to time series forecasting. A search conducted in September 2018 in the Scopus database for publications with the terms "neural network" and "forecast" retrieved around 10,850 results, with a peak in 2017, the year with the most papers issued (892 in total). This means that NNs still attract much interest among scholars dealing with forecasting problems. The interest in employing NNs stems from their capability of handling nonlinear functional dependencies and from the fact that they are data-driven models with few prior assumptions about the underlying process [61, 87]. Furthermore, NNs are universal function approximators; therefore, they can approximate any continuous function to an arbitrary degree of accuracy [61, 87].

The well-known multilayer perceptron (MLP) neural network is a universal function approximator [20]. However, the number of hidden layers and units must be sufficient for the given problem, and deciding how many hidden units to use directly affects the performance of the MLP. An MLP of size below the sufficiency usually fails to

✉ Rozaida Ghazali
rozaida@uthm.edu.my

Waddah Waheeb
waddah.waheeb@gmail.com

¹ Faculty of Computer Science and Information Technology, Universiti Tun Hussein Onn Malaysia, Batu Pahat, Johor, Malaysia

Neural Computing and Applications
https://doi.org/10.1007/s00521-019-04474-5
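The input scheme described in the abstract, in which lagged observations (the auto-regressive part) are combined with the fed-back network output and network error (the moving-average part), can be sketched in a minimal Python example. The function names, the simple linear read-out standing in for the ridge polynomial network, and the one-step feedback loop are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def build_input(series, t, y_hat_prev, err_prev, p=3):
    """Assemble an RPNN-EOF-style input vector at time t: p lagged
    observations (auto-regressive inputs), plus the previous network
    output and the previous network error (moving-average input),
    both fed back to the input layer."""
    lags = series[t - p:t]                     # y[t-p], ..., y[t-1]
    return np.concatenate([lags, [y_hat_prev, err_prev]])

def forecast(series, weights, p=3):
    """One-step-ahead recurrent forecasting with output and error
    feedback.  A plain linear mapping stands in for the ridge
    polynomial network purely for illustration."""
    y_hat_prev, err_prev = 0.0, 0.0
    preds = []
    for t in range(p, len(series)):
        x = build_input(series, t, y_hat_prev, err_prev, p)
        y_hat = float(weights @ x)             # placeholder network output
        err_prev = series[t] - y_hat           # error fed back at t + 1
        y_hat_prev = y_hat                     # output fed back at t + 1
        preds.append(y_hat)
    return np.array(preds)
```

With a weight vector that simply copies the most recent lag, the sketch reproduces a naive one-step forecast, which makes it easy to verify that the feedback bookkeeping is wired up correctly.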