GFAM: A Genetic Algorithm Optimization of Fuzzy ARTMAP
A. Al-Daraiseh, M. Georgiopoulos, G. Anagnostopoulos, A. S. Wu, M. Mollaghasemi
Abstract— Fuzzy ARTMAP (FAM) is currently considered
to be one of the premier neural network architectures for
solving classification problems. One of the limitations of Fuzzy
ARTMAP that has been extensively reported in the literature is
the category proliferation problem. That is, Fuzzy ARTMAP
has the tendency to increase its network size as it is
confronted with more and more data, especially if the data are
noisy and/or overlapping. To remedy this problem a
number of researchers have designed modifications to the
training phase of Fuzzy ARTMAP that had the beneficial effect
of reducing this phenomenon. In this paper we propose a new
approach to handle the category proliferation problem in
Fuzzy ARTMAP by evolving trained FAM architectures. We
refer to the resulting FAM architectures as GFAM. We
demonstrate through extensive experimentation that an evolved
FAM (GFAM) exhibits good generalization and small size, and
that the evolution produces an optimal or a good sub-optimal
network with reasonable computational effort. Furthermore, comparisons of
the GFAM with other approaches, proposed in the literature,
that address the FAM category proliferation problem, illustrate
that the GFAM has a number of advantages (i.e., it produces
architectures of smaller or equal size, with better or comparable
generalization, at reduced computational complexity).
I. INTRODUCTION
The Adaptive Resonance Theory (ART) was developed
by Grossberg (1976). One of the most celebrated ART
architectures is Fuzzy ARTMAP (Carpenter et al, 1992),
which has been successfully used in the literature for solving
a variety of classification problems. One of the limitations of
Fuzzy ARTMAP (FAM) that has been repeatedly reported in
the literature is the category proliferation problem, which is
tightly connected with the issue of overtraining.
Manuscript received January 31, 2006. This work was supported in part
by the National Science Foundation (NSF) under grants CRCD 0203446
and CCLI 0341601.
A. Al-Daraiseh is with the School of Electrical Engineering and
Computer Science, University of Central Florida, Orlando, FL 32816, USA
(e-mail: creepymaster@yahoo.com).
M. Georgiopoulos is with the School of Electrical Engineering and
Computer Science, University of Central Florida, Orlando, FL 32816, USA
(phone: (407) 823-5338, fax: (407) 823-5835; e-mail: michaelg@mail.ucf.edu).
G. Anagnostopoulos is with the Department of Electrical and Computer
Engineering, Florida Institute of Technology, Melbourne, FL 32901, USA
(e-mail: georgio@fit.edu).
A. S. Wu is with the School of Electrical Engineering and Computer
Science, University of Central Florida, Orlando, FL 32816, USA (e-mail:
aswu@cs.ucf.edu).
M. Mollaghasemi is with the Department of Industrial Engineering and
Management Systems, University of Central Florida, Orlando, FL 32816,
USA (e-mail: mollagha@mail.ucf.edu).
A number of authors have tried to address the category
proliferation/overtraining problem in Fuzzy ARTMAP.
Among them we refer to the work by Verzi et al. (2001),
Anagnostopoulos et al. (2003), and Gomez-Sanchez et al.
(2001), where different methods were introduced and
evaluated that allow Fuzzy ARTMAP categories to encode
patterns that are not necessarily mapped to the same label.
In this paper, we propose the use of genetic algorithms
(Goldberg, 1989) to solve the category proliferation problem
in Fuzzy ARTMAP. Genetic algorithms (GAs) are a class of
population-based stochastic search algorithms that are
developed from ideas and principles of natural evolution. An
important feature of these algorithms is their population-based
search strategy: individuals in a population compete with, and
exchange information with, one another in order to
perform certain tasks. Our approach starts with a population
of trained FAMs. GA operators are then utilized to
manipulate these trained FAM architectures in a way that
encourages better generalization and smaller size
architectures. The evolution of trained FAM architectures
allows these architectures to exchange and modify their
categories in a way that emphasizes smaller and more
accurate FAM architectures. Eventually, this process leads
to a FAM architecture (referred to as GFAM) that exhibits
good generalization performance and small size; both
benefits come at the additional advantage of reasonable
computational complexity.
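The evolutionary loop described above can be sketched as follows. This is a hypothetical illustration only: the chromosome encoding (a network reduced to a list of labeled categories), the distance-based classifier standing in for FAM's category competition, the fitness weighting, and the operator probabilities are all assumptions made for the sketch, not the exact GFAM procedure.

```python
import random

# Illustrative sketch: evolve a population of trained networks, each
# encoded as a list of categories {"label": class, "w": weight vector}.
# The encoding, classifier, and operators are assumptions, not the
# authors' exact GFAM design.

def distance(w, x):
    # City-block distance between a category weight vector and a pattern.
    return sum(abs(a - b) for a, b in zip(w, x))

def classify(net, x):
    # Assign the label of the closest category (stand-in for FAM's
    # category choice competition).
    return min(net, key=lambda cat: distance(cat["w"], x))["label"]

def fitness(net, val_set, size_penalty=0.01):
    # Reward validation accuracy, penalize the number of categories,
    # which drives the search toward small, accurate networks.
    correct = sum(1 for x, y in val_set if classify(net, x) == y)
    return correct / len(val_set) - size_penalty * len(net)

def evolve(population, val_set, generations=100, p_mut=0.05, p_del=0.1):
    for _ in range(generations):
        scored = sorted(population, key=lambda n: fitness(n, val_set),
                        reverse=True)
        parents = scored[:len(scored) // 2]        # truncation selection
        children = []
        while len(children) < len(population) - len(parents):
            a, b = random.sample(parents, 2)
            cut_a = random.randint(1, len(a))      # one-point crossover:
            cut_b = random.randint(1, len(b))      # networks exchange
            child = a[:cut_a] + b[cut_b:]          # whole categories
            if random.random() < p_del and len(child) > 1:
                child.pop(random.randrange(len(child)))  # prune a category
            for i, cat in enumerate(child):
                if random.random() < p_mut:        # perturb category weights
                    child[i] = {"label": cat["label"],
                                "w": [w + random.gauss(0, 0.05)
                                      for w in cat["w"]]}
            children.append(child)
        population = parents + children            # elitism: parents survive
    return max(population, key=lambda n: fitness(n, val_set))
```

Under this encoding, crossover lets two trained networks exchange whole categories, while the size penalty in the fitness emphasizes the smaller of two equally accurate networks.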
Genetic algorithms have been extensively used to evolve
artificial neural networks. For a thorough exposition of the
available research literature in evolving neural networks the
interested reader is advised to consult Yao (1999). To the best
of our knowledge, no work in the literature has so far
attempted to evolve FAM neural network structures; this is
the main focus of our effort.
The organization of this paper is as follows: In Section 2,
we present GFAM. In Section 3, we describe the
experiments and the datasets used to assess the performance
of GFAM, and we also compare GFAM to four other ART
networks that attempted to resolve the category proliferation
problem in Fuzzy ARTMAP. Finally, in Section 4, we
summarize our work.
0-7803-9489-5/06/$20.00/©2006 IEEE
2006 IEEE International Conference on Fuzzy Systems
Sheraton Vancouver Wall Centre Hotel, Vancouver, BC, Canada
July 16-21, 2006