Neurocomputing 61 (2004) 429–437
www.elsevier.com/locate/neucom
doi:10.1016/j.neucom.2004.04.001

Letters

An activation function adapting training algorithm for sigmoidal feedforward networks

Pravin Chandra*, Yogesh Singh
School of Information Technology, GGS Indraprastha University, Kashmere Gate, Delhi 110006, India

Available online 25 June 2004

Abstract

The universal approximation results for sigmoidal feedforward artificial neural networks do not recommend a preferred activation function. In this paper a new activation function adapting algorithm is proposed for sigmoidal feedforward neural network training. The algorithm is compared against the backpropagation algorithm on four function approximation tasks. The results demonstrate that the proposed algorithm can be an order of magnitude faster than the backpropagation algorithm.

© 2004 Elsevier B.V. All rights reserved.

Keywords: Feedforward artificial neural networks; Sigmoidal activation; Squashing function; Self-adaptation

1. Introduction

Sigmoidal feedforward artificial neural networks (SFFANNs) with one hidden layer of an arbitrary number of sigmoidal hidden nodes have been established to be universal approximators of continuous functions [2–5]. The theoretical results for the universal approximation property of SFFANNs do not favour any particular sigmoidal function to be used as the activation function of the hidden layer(s) [1–5,7]. Moreover, these results do not prescribe a methodology for obtaining an approximating network. Generally, the activation function at the hidden layer is chosen arbitrarily and fixed before the network is trained. In this paper we propose an algorithm that adapts the activation function itself; the choice of the (final) activation function depends on the data set used for training and the initial weights.

* Corresponding author.
E-mail addresses: pc ipu@yahoo.com, pchandra@ipu.edu (P. Chandra), ys66@redimail.com, ys@ipu.edu (Y. Singh).
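The excerpt above does not give the paper's actual adaptation rule, but the idea of adapting the activation function during training can be sketched as follows. In this hypothetical illustration, each hidden node uses a sigmoid with a trainable slope parameter a, sigma_a(z) = 1/(1 + exp(-a z)), and a is updated by gradient descent alongside the ordinary weights on a toy regression task. All names, the slope parameterization, and the learning rate are assumptions for illustration, not the authors' method.

```python
import numpy as np

# Hypothetical sketch of an activation-function-adapting network (not the
# paper's algorithm): each hidden node's sigmoid has a trainable slope 'a',
#   sigma_a(z) = 1 / (1 + exp(-a * z)),
# and 'a' is trained by gradient descent together with the weights.

rng = np.random.default_rng(0)

def sigmoid(z, a):
    """Sigmoid with per-node slope a (broadcast over the last axis)."""
    return 1.0 / (1.0 + np.exp(-a * z))

# Toy 1-D regression task: approximate sin(x) on [-2, 2].
X = np.linspace(-2.0, 2.0, 64).reshape(-1, 1)
T = np.sin(X)

H = 8                                  # number of hidden nodes
W1 = rng.normal(0.0, 1.0, (1, H))      # input-to-hidden weights
b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, (H, 1))      # hidden-to-output weights
b2 = np.zeros(1)
a = np.ones(H)                         # trainable activation slopes

def forward(X):
    z = X @ W1 + b1                    # hidden pre-activations
    h = sigmoid(z, a)                  # adaptive sigmoid
    return z, h, h @ W2 + b2           # linear output node

_, _, y0 = forward(X)
initial_mse = float(np.mean((y0 - T) ** 2))

eta = 0.05
for epoch in range(2000):
    z, h, y = forward(X)
    e = y - T                          # output error
    # Backprop through the linear output layer.
    dW2 = h.T @ e / len(X)
    db2 = e.mean(0)
    # For h = sigma_a(z):  dh/dz = a*h*(1-h)  and  dh/da = z*h*(1-h).
    dh = e @ W2.T
    s = h * (1.0 - h)
    dz = dh * s * a
    da = (dh * s * z).mean(0)          # gradient w.r.t. the slopes
    dW1 = X.T @ dz / len(X)
    db1 = dz.mean(0)
    W2 -= eta * dW2; b2 -= eta * db2
    W1 -= eta * dW1; b1 -= eta * db1
    a -= eta * da                      # the activation function adapts here

_, _, y1 = forward(X)
final_mse = float(np.mean((y1 - T) ** 2))
```

The slope update is the only step beyond plain backpropagation: because sigma_a is differentiable in a, the same chain rule that yields the weight gradients also yields a gradient for the activation parameters, so the shape of each node's sigmoid is selected by the data rather than fixed before training.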