A Wavelet-Based Recurrent Fuzzy Neural
Network Trained With Stochastic Optimization
Algorithm
Ahmad T. AbdulSadda, PhD Student,
Department of Applied Science, Systems Engineering,
College of Engineering and Information Technology
(EIT), University of Arkansas at Little Rock (UALR)
email: atabdulsadda@ualr.edu
Kameran Iqbal, Associate Professor,
Department of Systems Engineering, College of
Engineering and Information Technology (EIT),
University of Arkansas at Little Rock (UALR)
email: kxiqbal@ualr.edu
Abstract— This paper presents a Wavelet-based Recurrent Fuzzy
Neural Network (WRFNN) trained with a stochastic search-based
adaptation algorithm. A WRFNN is a recurrent network of neurons
employing wavelet functions whose outputs are combined using
fuzzy rules. In this paper, an earlier WRFNN model proposed by
Lin and Chin [1] is modified by applying the Simultaneous
Perturbation Stochastic Approximation (SPSA) method to train the
network. The model includes TSK-type fuzzy implication to compute
the output of each layer. The SPSA algorithm was shown to be a
stable global optimization technique that is applicable to WRFNN
models, with demonstrated computational advantages over other
optimization algorithms.
Keywords—neural networks, fuzzy-wavelet, simultaneous
perturbation algorithm.
I. INTRODUCTION
Recently, fuzzy neural networks have been shown to
be successful in a variety of applications [1]–[4]. Two
common types of fuzzy neural networks are Mamdani-type and
TSK-type. In Mamdani-type fuzzy neural networks [3], [4],
the minimum fuzzy implication is used in fuzzy reasoning,
whereas in TSK-type fuzzy neural networks [5] the consequence
of each rule is a function of the input variables. The generally
adopted function is a linear combination of the input variables
plus a constant term. Researchers [6] have shown that, compared
to Mamdani-type fuzzy neural networks, a TSK-type fuzzy neural
network is capable of achieving superior performance in network
size and learning accuracy.
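As a concrete illustration, a first-order TSK consequent of the kind described above (a linear combination of the inputs plus a constant) can be sketched as follows. The Gaussian membership functions, the two rules, and all parameter values here are illustrative assumptions, not the model used in this paper:

```python
import numpy as np

def gaussian_mf(x, mean, sigma):
    """Gaussian membership grade of a single input value."""
    return np.exp(-((x - mean) ** 2) / (2.0 * sigma ** 2))

def tsk_inference(x, rules):
    """First-order TSK inference: each rule's consequent is a linear
    combination of the inputs plus a constant, and the output is the
    firing-strength-weighted average of the rule consequents."""
    strengths, consequents = [], []
    for means, sigmas, coeffs, bias in rules:
        # Firing strength: product of per-input membership grades.
        w = np.prod([gaussian_mf(xi, m, s)
                     for xi, m, s in zip(x, means, sigmas)])
        strengths.append(w)
        consequents.append(np.dot(coeffs, x) + bias)
    strengths = np.array(strengths)
    return np.dot(strengths, consequents) / np.sum(strengths)

# Two illustrative rules over a 2-input space:
# (membership means, membership sigmas, linear coefficients, constant).
rules = [
    ([0.0, 0.0], [1.0, 1.0], [1.0, -0.5], 0.2),
    ([1.0, 1.0], [1.0, 1.0], [0.3, 0.8], -0.1),
]
y = tsk_inference(np.array([0.5, 0.5]), rules)
```

A zero-order TSK model is recovered by setting the linear coefficients to zero, leaving only the constant term in each consequent.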
A recurrent neural network, which naturally incorporates
dynamic elements in the form of feedback connections that can
serve as internal memory, has recently attracted great
interest [7]–[9]. For example, Elman networks [7] comprise
feedforward multilayer perceptron networks with an extra set
of context nodes that copy the delayed states of the hidden
or output nodes back to the network input. Radial basis
function recurrent networks [8] were proposed to make the
network output history-sensitive. Similarly, Jin et al. [9]
studied the approximation of continuous-time dynamic
systems using dynamic recurrent neural networks (DRNN).
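The Elman context-node mechanism described above can be sketched as a single forward step; the tanh hidden units, the layer sizes, and the random weight initialization are illustrative assumptions:

```python
import numpy as np

def elman_step(x, context, W_in, W_ctx, W_out):
    """One forward step of an Elman network: the context nodes hold a
    delayed copy of the previous hidden state, giving the network an
    internal memory of its input history."""
    hidden = np.tanh(W_in @ x + W_ctx @ context)
    output = W_out @ hidden
    # The new hidden state is copied back as the next step's context.
    return output, hidden

# Illustrative sizes: 2 inputs, 3 hidden/context nodes, 1 output.
rng = np.random.default_rng(1)
W_in = rng.normal(scale=0.5, size=(3, 2))
W_ctx = rng.normal(scale=0.5, size=(3, 3))
W_out = rng.normal(scale=0.5, size=(1, 3))

context = np.zeros(3)  # context starts empty
for x in [np.array([1.0, 0.0]), np.array([0.0, 1.0])]:
    y, context = elman_step(x, context, W_in, W_ctx, W_out)
```

Because the second step's hidden activations depend on the first step's context, the same input presented at different times can produce different outputs, which is what makes the network history-sensitive.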
The simultaneous perturbation stochastic
approximation (SPSA) algorithm was proposed by Spall
(1988, 1992) [10]; it is based on a highly efficient
gradient approximation technique requiring only two
measurements of a scalar differentiable loss function per
iteration. The SPSA algorithm belongs to a class of iterative
gradient-free algorithms [10]–[12] that have been used
effectively for multivariate nonlinear optimization of complex
systems when an accurate system model is not available. Under
reasonably general conditions, SPSA and the standard
finite-difference stochastic approximation methods achieve the
same level of statistical accuracy for a given number of
iterations, even though SPSA uses p times fewer measurements
of the objective function at each iteration (since each gradient
approximation uses only 1/p the number of function
measurements) [10].
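The two-measurement gradient approximation at the heart of SPSA can be sketched as follows. The quadratic test loss and the gain sequences are illustrative choices (the decay exponents 0.602 and 0.101 follow values commonly recommended by Spall), not the tuning used in this paper:

```python
import numpy as np

def spsa_step(loss, theta, a, c, rng):
    """One SPSA iteration: estimate the full p-dimensional gradient of
    `loss` at `theta` from only two loss measurements."""
    p = theta.size
    # Simultaneous perturbation: each component is +/-1 (Bernoulli).
    delta = rng.choice([-1.0, 1.0], size=p)
    # Two measurements of the (possibly noisy) scalar loss function.
    y_plus = loss(theta + c * delta)
    y_minus = loss(theta - c * delta)
    # All p gradient components come from the same two measurements;
    # a finite-difference scheme would need 2p measurements instead.
    g_hat = (y_plus - y_minus) / (2.0 * c * delta)
    return theta - a * g_hat

# Usage: minimize a simple quadratic loss with minimum at theta = 3.
rng = np.random.default_rng(0)
loss = lambda th: np.sum((th - 3.0) ** 2)
theta = np.zeros(4)
for k in range(1, 501):
    # Decaying gain sequences a_k and c_k.
    theta = spsa_step(loss, theta,
                      a=0.1 / k ** 0.602, c=0.1 / k ** 0.101, rng=rng)
```

Note that the perturbation directions must have zero mean and bounded inverse moments; the symmetric Bernoulli distribution used here is the standard choice.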
This paper discusses the application of a WRFNN
trained by the SPSA algorithm. The paper is organized as follows.
Section 2 describes the model of wavelet neural networks.
Section 3 presents the structure of the wavelet-based recurrent
fuzzy neural network model. Section 4 explains the simultaneous
perturbation method. Section 5 describes the problem
formulation. Sections 6 and 7 consist of an illustrative example
and the conclusion.
II. WAVELET BASES AND WAVELET NEURAL NETWORKS
A set of wavelet bases is a suitable tool for effectively
representing nonlinearity. Classical orthogonal wavelets are
infinite, continuous, and differentiable; their support is
−∞ < x < ∞. Daubechies [13] presented wavelet bases that are
compactly rather than infinitely supported. Compared with a
three-layered feedforward neural network, a simple wavelet
neural network exhibits a much higher ability to generalize
and a much shorter learning time. This study adopts
non-orthogonal, compactly supported functions over a finite
range as wavelet bases. All the wavelet bases are allocated
over the normalized range [0, 1] on the variable space.
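One possible sketch of such wavelet bases uses the Mexican-hat mother wavelet, a common non-orthogonal choice whose support is effectively finite; the particular translations and dilation value below are illustrative assumptions, not the allocation used in this paper:

```python
import numpy as np

def mother_wavelet(x):
    """Mexican-hat mother wavelet: non-orthogonal, with effectively
    compact support (it decays rapidly away from the origin)."""
    return (1.0 - x ** 2) * np.exp(-(x ** 2) / 2.0)

def wavelet_basis(x, translation, dilation):
    """Translated and dilated copy of the mother wavelet; the input
    is assumed pre-normalized to the range [0, 1]."""
    return mother_wavelet((x - translation) / dilation)

# Allocate a few bases over the normalized range [0, 1]: each basis
# peaks at its translation and is scaled by the shared dilation.
x = np.linspace(0.0, 1.0, 5)
phi = np.array([wavelet_basis(x, t, 0.25) for t in (0.25, 0.5, 0.75)])
```

Each row of `phi` is one basis evaluated over the normalized input grid, with its maximum value of 1 at its translation point.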
Neural networks employing wavelet neurons are
referred to as Wavelet Neural Networks (WNN). The WNN are
Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics
San Antonio, TX, USA - October 2009
978-1-4244-2794-9/09/$25.00 ©2009 IEEE