B. Murgante et al. (Eds.): ICCSA 2013, Part I, LNCS 7971, pp. 427–437, 2013.
© Springer-Verlag Berlin Heidelberg 2013
Functional Link Neural Network – Artificial Bee Colony
for Time Series Temperature Prediction
Yana Mazwin Mohmad Hassim and Rozaida Ghazali
Faculty of Computer Science and Information Technology
Universiti Tun Hussein Onn Malaysia (UTHM),
86400 Batu Pahat, Johor, Malaysia
{yana,rozaida}@uthm.edu.my
Abstract. Higher Order Neural Networks (HONNs) have emerged as an
important tool for time series prediction and have been successfully applied in
many engineering and scientific problems. One of the HONN models is the
Functional Link Neural Network (FLNN), known to be convenient for function
approximation and extensible to pattern recognition, with a faster
convergence rate and a lower computational load than an ordinary
feedforward network such as the Multilayer Perceptron (MLP). The most
commonly used algorithm for training the FLNN is the Backpropagation (BP)
learning algorithm. However, one crucial problem with the BP learning
algorithm is that it can easily become trapped in local minima. This paper
proposes an alternative learning scheme for the FLNN, applied to temperature
forecasting, based on the Artificial Bee Colony (ABC) optimization
algorithm. The ABC adopted in this work is known to have good exploration
and exploitation capabilities in searching for optimal weights, especially
in numerical optimization problems. The prediction results of FLNN-ABC are
compared with those of the original FLNN architecture, and we find that
FLNN-ABC gives better results for next-day-ahead prediction.
Keywords: Temperature prediction, Functional Link Neural Network, Artificial
Bee Colony Algorithm.
1 Introduction
Artificial Neural Networks (ANNs) have been successfully applied to a
variety of real-world tasks, including prediction, classification, signal
processing and image recognition, especially in industry, business and
science [1, 2]. The most common ANN architecture is the multilayer
feedforward network known as the Multilayer Perceptron (MLP). Because of its
multilayered structure, the MLP requires excessive training time for
learning [3]: the number of weights and the training time increase as the
number of layers and of nodes per layer increases [3, 4]. To overcome this
drawback of the MLP, another type of network, known as Higher Order Neural
Networks (HONNs), has been introduced [5]. HONNs are a type of feedforward
neural network with a single layer of trainable weights, which can help