Neural Networks, Vol. 4, pp. 599-613, 1991. Printed in the USA. All rights reserved. Copyright © 1991 Pergamon Press plc.

ORIGINAL CONTRIBUTION

Exponential Stability and a Systematic Synthesis of a Neural Network for Quadratic Minimization

SUBRAMANIA I. SUDHARSANAN* AND MALUR K. SUNDARESHAN

University of Arizona

(Received 5 February 1990; revised and accepted 23 January 1991)

Abstract--A continuous-time network with piecewise linear neuron input-output characteristics is proposed for optimization applications. Certain qualitative properties of the network that are of fundamental importance in these applications, such as the uniqueness of the equilibrium condition and the global exponential stability of this equilibrium with any arbitrarily prescribed degree, are analytically investigated. Deriving guidance from the obtained analytical results, a systematic synthesis procedure is outlined for identifying the network parameters and the bias inputs needed to employ the neural network for efficiently solving the important class of optimization problems in which the objective is to minimize a specified quadratic function of the decision variables. To demonstrate the versatility of the solution procedure, three illustrative applications are outlined, namely the synthesis of a class of spatial filters popularly employed in image recognition, the design of an associative memory by a master-slave formulation, and the estimation of the parameters of a linear system by a least squares procedure, and the superiority of the present approach over existing results is indicated. Some of the present results, concerning the characterization of the network equilibrium conditions and the network scaling for confining the equilibrium to desired operational ranges, are of basic interest and are useful in other applications of the neural network besides the specific optimization problems discussed in this paper.
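The paper's network employs piecewise linear neuron characteristics whose unique equilibrium is made globally exponentially stable; as a simplified illustration of the underlying idea (not the authors' construction), the following sketch simulates the plain gradient-flow dynamics x' = -(Qx - b), whose unique equilibrium x* = Q^{-1}b is the global minimizer of the quadratic f(x) = (1/2)x'Qx - b'x when Q is symmetric positive definite. The function name and parameter values are hypothetical.

```python
# Hedged sketch: a forward-Euler simulation of the gradient flow
#   x'(t) = -(Q x(t) - b),
# a simplified stand-in for the paper's piecewise linear network.
# For symmetric positive definite Q, this flow has a unique equilibrium
# x* = Q^{-1} b, and trajectories converge to it exponentially.

def gradient_flow_minimize(Q, b, x0, step=0.01, n_steps=2000):
    """Integrate x' = -(Q x - b) by forward Euler; returns the final state."""
    x = list(x0)
    n = len(x)
    for _ in range(n_steps):
        # Gradient of f(x) = 0.5 x'Qx - b'x is Qx - b.
        grad = [sum(Q[i][j] * x[j] for j in range(n)) - b[i]
                for i in range(n)]
        x = [x[i] - step * grad[i] for i in range(n)]
    return x

# Example (hypothetical values): Q positive definite with eigenvalues 2 and 4,
# so the unique equilibrium x* = Q^{-1} b = (1, 1) attracts every trajectory.
Q = [[2.0, 0.0], [0.0, 4.0]]
b = [2.0, 4.0]
x_star = gradient_flow_minimize(Q, b, x0=[0.0, 0.0])
```

Because the convergence is exponential with rate governed by the smallest eigenvalue of Q, the simulated trajectory reaches the minimizer to high accuracy well within the 2000 Euler steps used here; the step size must be small enough (here 0.01 against a largest eigenvalue of 4) for the discretization to remain stable.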
Keywords--Neural network synthesis, Quadratic minimization, Equilibrium characterization, Exponential stability, Neural network dynamics, Optimization applications, Qualitative analysis of network properties, Global convergence of network solutions.

* S. I. Sudharsanan is currently with the Advanced Development Systems Design, IBM Corporation, Boca Raton, FL 33429.
Requests for reprints should be sent to Prof. M. K. Sundareshan, Dept. of Electrical and Computer Engineering, University of Arizona, Tucson, AZ 85721.

1. INTRODUCTION

In the synthesis of dynamical neural networks, ensuring a rapid convergence of the unforced network trajectories, starting from arbitrary initial states, to the equilibrium conditions is of particular importance. A characterization of the network equilibria and a rigorous analysis of the qualitative properties of the network are hence of great interest. Results of such investigations for various types of neural networks have been reported in several recent works (Grossberg, 1982; Cohen & Grossberg, 1983; Hopfield, 1984; Li, Michel, & Porod, 1988; Michel, Farrel, & Porod, 1989; Dimopoulos, 1989). Such qualitative studies of the convergence properties are highly useful in serving as guidelines for developing systematic synthesis procedures and in identifying appropriate policies for training the neural network.

Depending on whether the network being constructed is eventually to be used as a content-addressable memory or for such applications as optimization and nonlinear input-output mapping, different types of questions are to be examined in such a qualitative study. For instance, in the synthesis of a neural network to serve as an associative memory, the analysis of the recall abilities of a set of stored vectors in the state space is of importance, and hence one is interested in designing a network with multiple equilibrium points, each of which corresponds to a vector to be stored.
Furthermore, the network parameters should satisfy appropriate conditions to ensure the local stability of each equilibrium point, such that a network trajectory starting at an initial state within the basin of attraction of the equilibrium point (representing a partial memory cue) rapidly converges to this equilibrium point. On the other hand, in optimization and nonlinear input-output mapping applications, the nature of the equilibrium and the stability properties of interest are significantly different. In particular, one desires to construct a network with a unique equilibrium point that is a globally stable attractor state. For employing the network to compute the global minimum of a specified objective function in optimization applications, one can then resort to a strategy of associating this unique equilibrium point with the global minimum sought (thus preventing convergence to local minima of the objective function). Similarly, for a nonlinear mapper, since the equilibrium condition is specified