Feedforward Sigmoidal Networks—Equicontinuity
and Fault-Tolerance Properties
Pravin Chandra, Member, IEEE, and Yogesh Singh, Member, IEEE
Abstract—Sigmoidal feedforward artificial neural networks (FFANNs) have been established to be universal approximators of continuous functions. The universal approximation results are summarized to identify the function sets, represented by sigmoidal FFANNs, that possess the universal approximation property. The equicontinuity properties of the identified sets are analyzed, and equicontinuity is related to the fault tolerance of sigmoidal FFANNs. The commonly used arbitrary-weight sigmoidal FFANNs are shown to form nonequicontinuous sets, whereas a class of bounded-weight sigmoidal FFANNs is established to be equicontinuous. The fault-tolerance behavior of the networks is analyzed, and error bounds on the induced errors are established.
Index Terms—Equicontinuity, fault-tolerance, feedforward artificial neural networks (FFANNs), function sets, sigmoidal networks.
Manuscript received March 28, 2003; revised October 6, 2003.
The authors are with the School of Information Technology, GGS Indraprastha University, Delhi-110006, India (e-mail: pc_ipu@yahoo.com, pchandra@ipu.edu; ys66@rediffmail.com, ys@ipu.edu).
Digital Object Identifier 10.1109/TNN.2004.831198

NOMENCLATURE
$\sigma(\mathbf{w} \cdot \mathbf{x})$  $n$-dimensional ridge function.
$\boldsymbol{\alpha}$  Weight vector associated with the output node; $\alpha_i$ represents the $i$th component of the vector $\boldsymbol{\alpha}$.
ANNs  Artificial neural networks.
BNNs  Biological neural networks.
$C(\mathcal{X})$  The set of continuous functions defined over a compact set $\mathcal{X}$.
$C_B(\mathcal{X})$  The set of finitely bounded continuous functions defined over a compact set $\mathcal{X}$.
$C_B^1(\mathcal{X})$  The set of bounded continuous functions with norm less than or equal to unity, defined over an appropriate compact space $\mathcal{X}$.
EIS  Equicontinuous in input space.
EPS  Equicontinuous in phase space.
EWS  Equicontinuous in weight space.
FFANNs  Feedforward artificial neural networks.
$h_i$  The output from the $i$th hidden node.
$n_i$  The net input to the $i$th node of the hidden layer.
$y$  The output from the FFANN.
$P$  Number of training exemplars.
$\overline{\mathbb{R}}$  The extended one-dimensional real space.
$\mathbb{R}^n$  The $n$-dimensional real space.
$\sigma$  Any sigmoidal function; specifically, the log-sigmoid function $\sigma(t) = 1/(1 + e^{-t})$.
$\mathcal{S}$  The function sets represented by FFANNs.
$S^n$  The $n$-dimensional unit sphere.
$\Theta$  The set of all adjustable parameters of the network.
$w_{ij}$  The weight between the $j$th input node and the $i$th node of the hidden layer.
$W$  The weight matrix without the pseudoweights corresponding to the pseudoinput node (clamped at 1).
$\widetilde{W}$  The weight matrix with the pseudoweights corresponding to the pseudoinput node (clamped at 1).
$\mathbf{w}_i$, $\widetilde{\mathbf{w}}_i$  The $i$th rows of the weight matrices $W$ and $\widetilde{W}$, respectively. These rows correspond to the weight vectors associated with the $i$th hidden node.
$\mathcal{X}$  A compact set (of inputs to the FFANN). Usually taken to be $[0, 1]^n$ or $[-1, 1]^n$.
$\mathbf{x} \cdot \mathbf{w}$  The dot product of two vectors, $\mathbf{x} \cdot \mathbf{w} = \sum_i x_i w_i$. This definition allows us to write the product without the transpose operator in the case of column representation of vectors.
$\mathbb{Z}^+$  The set of positive integers.
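For concreteness, the forward computation implied by this notation can be sketched in a few lines of code. The following Python sketch is illustrative only (the single hidden layer, the linear output node, and all identifiers are assumptions made for this illustration, not notation fixed by the paper):

    import numpy as np

    def logsigmoid(t):
        # Log-sigmoid activation: sigma(t) = 1 / (1 + exp(-t)).
        return 1.0 / (1.0 + np.exp(-t))

    def ffann_output(x, W_tilde, alpha):
        # Forward pass of a single-hidden-layer sigmoidal FFANN.
        #   x       : input vector of length n
        #   W_tilde : (m, n+1) weight matrix, including the pseudoweights
        #             for the pseudoinput node clamped at 1
        #   alpha   : weight vector of length m for the output node
        x_tilde = np.append(x, 1.0)   # augment input with the pseudoinput 1
        n_net = W_tilde @ x_tilde     # net inputs n_i to the hidden nodes
        h = logsigmoid(n_net)         # hidden-node outputs h_i
        return alpha @ h              # network output y

    # Example: n = 2 inputs, m = 3 hidden nodes, weights drawn from [-1, 1].
    rng = np.random.default_rng(0)
    W_tilde = rng.uniform(-1.0, 1.0, size=(3, 3))
    alpha = rng.uniform(-1.0, 1.0, size=3)
    print(ffann_output(np.array([0.5, -0.5]), W_tilde, alpha))

The bounded weight range [-1, 1] used in the example anticipates the bounded-weight networks analyzed later; arbitrary-weight networks would instead draw weights from an unbounded range.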
I. INTRODUCTION
ARTIFICIAL NEURAL NETWORKS (ANNs) are recognized as an alternative paradigm of computation vis-à-vis the concept of programmed computation, in which (usually procedural) algorithms are designed and sequentially implemented.
ANNs have been inspired by biological neural networks (BNNs) [1]–[4], though the current models of ANNs cannot be called realistic models of BNNs by any stretch of the imagination [4]. The ANN models may be described by the conceptual relation

    ANN model = (architecture) + (training/learning paradigm).   (1)
Many models of ANNs representing an architectural spec-
trum and a variety of training/learning paradigms exist [1]–[3].
One of the most widely used models is the sigmoidal feedforward artificial neural network. These networks have a nonrecurrent
(computational) architecture and learn by employing the super-
vised learning/training paradigm (see [1]–[3] for further details
and references).
The (sigmoidal) feedforward artificial neural networks
(FFANNs) are the subject of study in this paper. Three factors
(out of many) that have contributed significantly to the popu-
larity of the sigmoidal FFANNs are as follows:
1) The universal approximation results concerning sig-
moidal FFANNs.
2) The intuitive appeal and the simplicity (of coding, implementation, and use) of the backpropagation algorithm.
3) The assumed fault-tolerant behavior of (sigmoidal) FFANNs (see the illustrative sketch following this list).
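A minimal sketch of what factor 3 means in practice is given below (a hypothetical experiment; the network dimensions, weight ranges, fault model, and perturbation size delta are all assumptions, and the helper ffann_output repeats the forward pass sketched after the Nomenclature). A single hidden-layer weight is perturbed, and the induced deviation of the network output is sampled over the input space:

    import numpy as np

    def logsigmoid(t):
        return 1.0 / (1.0 + np.exp(-t))

    def ffann_output(x, W_tilde, alpha):
        # Single-hidden-layer sigmoidal FFANN (pseudoinput clamped at 1).
        x_tilde = np.append(x, 1.0)
        return alpha @ logsigmoid(W_tilde @ x_tilde)

    rng = np.random.default_rng(1)
    W_tilde = rng.uniform(-1.0, 1.0, size=(5, 4))   # 3 inputs, 5 hidden nodes
    alpha = rng.uniform(-1.0, 1.0, size=5)
    xs = rng.uniform(0.0, 1.0, size=(1000, 3))      # inputs sampled from [0, 1]^3

    # Fault model: perturb a single hidden-layer weight by delta and record
    # the worst observed deviation of the output over the sampled inputs.
    delta = 0.1
    W_faulty = W_tilde.copy()
    W_faulty[0, 0] += delta

    errors = [abs(ffann_output(x, W_tilde, alpha) - ffann_output(x, W_faulty, alpha))
              for x in xs]
    print(f"max induced output error over samples: {max(errors):.4f}")

Because the log-sigmoid has a derivative bounded above by 1/4, the induced error in this sketch cannot exceed $|\alpha_1| \, \delta \, |x_1| / 4$ for bounded inputs; bounds of this flavor, and the conditions under which they hold uniformly, are the subject of the fault-tolerance analysis in this paper.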