A synthesis procedure for associative memories based on space-varying cellular neural networks

J. Park a,*, H.-Y. Kim b, Y. Park c, S.-W. Lee d

a Department of Control and Instrumentation Engineering, Korea University, Chochiwon, Chungnam, 339-800, South Korea
b Department of Visual Image Processing, Korea University, Seoul, 136-701, South Korea
c Division of Aerospace Engineering, Department of Mechanical Engineering, KAIST, Taejon, 305-701, South Korea
d Department of Computer Science and Engineering, Korea University, Seoul, 136-701, South Korea

Received 9 March 2000; revised 5 October 2000; accepted 5 October 2000

Abstract

In this paper, we consider the problem of realizing associative memories via space-varying CNNs (cellular neural networks). Based on some known results and a newly derived theorem for the CNN model, we propose a synthesis procedure for obtaining a space-varying CNN that can store given bipolar vectors with certain desirable properties. The major part of our synthesis procedure consists of solving generalized eigenvalue problems and/or linear matrix inequality problems, which can be efficiently solved by recently developed interior point methods. The validity of the proposed approach is illustrated by a design example. © 2001 Elsevier Science Ltd. All rights reserved.

Keywords: Associative memory; Cellular neural network; Generalized eigenvalue problem; Linear matrix inequality problem

1. Introduction

CNNs (cellular neural networks) as introduced by Chua and Yang (1988) are a special class of continuous-time feedback neural networks which consist of cells connected only to their neighborhood cells. CNNs are well suited for analog very-large-scale integration implementation due to this local interconnection property, and have found many applications in a variety of areas (see, e.g. Roska & Vandewalle, 1995, and references therein).
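As an illustration of the cell dynamics underlying such networks, the following sketch integrates the standard Chua–Yang CNN equations dx/dt = -x + Ty + I with the piecewise-linear output y = sat(x). This is a generic numerical illustration, not the synthesis procedure of this paper; the function names and the forward-Euler scheme are our own choices.

```python
import numpy as np

def sat(x):
    # Chua-Yang piecewise-linear output: y = 0.5 * (|x + 1| - |x - 1|),
    # i.e. x clipped to the interval [-1, 1].
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def simulate_cnn(T, I, x0, dt=0.01, steps=2000):
    """Forward-Euler integration of dx/dt = -x + T @ sat(x) + I.

    T is the (possibly space-varying) connection weight matrix; in a CNN,
    T[i, j] is nonzero only when cell j lies in cell i's neighborhood.
    Returns the saturated (bipolar, once converged) output sat(x).
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x += dt * (-x + T @ sat(x) + I)
    return sat(x)
```

For a diagonal T with self-feedback greater than 1 and zero bias, each cell saturates toward the sign of its initial state, which is the basic mechanism that lets suitably designed weights pin bipolar patterns as stable outputs.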
In this paper, we consider the problem of realizing associative memories via space-varying CNNs, in which the cells have local interconnections with identical neighborhood size, but with cell-dependent connection weights.

Since Hopfield (1982) showed that fully interconnected feedback neural networks trained by the Hebbian learning rule can function as associative memories, the synthesis of associative memories using neural network models has attracted a great deal of interest among researchers (see, e.g. Hassoun, 1993). Recently, associative memories based on CNNs have also been studied. Generally, the goal of CNN memories is to store desired bipolar vectors as memory vectors of the network, so that when a vector sufficiently close to a stored bipolar vector is applied as an initial condition of the network, the stored bipolar vector is retrieved as the final output of the network.

In order to design space-varying CNNs that can achieve this goal, several methods have been proposed. Liu and Michel (1993) proposed a singular-value-decomposition-based synthesis procedure. This procedure is a modification of the synthesis method for neural associative memories (Li, Michel, & Porod, 1989), and is often called 'the eigenstructure method'. Seiler, Schuler, and Nossek (1993) developed an algorithm to design a space-varying CNN with prescribed stable and unstable output patterns while maximizing its robustness with respect to changes of its parameters. The algorithm consists of formulating and solving a set of linear inequalities using linear programming. Liu and Lu (1997) showed that the perceptron training algorithm can be applied to design space-varying CNNs as well as fully connected feedback neural networks. Chan and Zak (1998) proposed a 'designer' neural network for the synthesis of associative memories based on a class of discrete cellular neural networks.
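The retrieval behaviour described above can be illustrated with a classical Hopfield-style discrete recall: a bipolar pattern is stored in a Hebbian outer-product weight matrix, and a corrupted probe is driven back to the stored pattern by sign updates. This is a simplified stand-in for the continuous CNN dynamics, not the synthesis procedure developed in this paper; all names below are illustrative.

```python
import numpy as np

def hebbian_weights(patterns):
    # Outer-product (Hebbian) weight matrix over bipolar row patterns,
    # with the diagonal zeroed so no cell feeds back on itself.
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, iters=20):
    # Synchronous sign updates from a (possibly corrupted) bipolar probe.
    y = np.asarray(probe, dtype=float).copy()
    for _ in range(iters):
        y = np.sign(W @ y)
        y[y == 0] = 1.0  # break exact ties deterministically
    return y
```

With a single stored pattern, a probe differing in one bit is restored in a single update, which is precisely the "nearby initial condition converges to the stored vector" property that the CNN synthesis methods surveyed here aim to guarantee.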
Perfetti (1999) presented a local learning algorithm which can be efficiently implemented on chip by exploiting the parallel analog computation of CNNs. In Park and Park (2000) we proposed a GEVP (generalized eigenvalue problem)-based synthesis procedure for space-varying CNNs. The procedure yields CNN memories whose connection weight matrices are symmetric with their diagonal elements fixed at unity. Thus, it is not applicable when

Neural Networks 14 (2001) 107–113. PII: S0893-6080(00)00086-1. www.elsevier.com/locate/neunet

* Corresponding author. Tel.: +82-41-860-1444; fax: +82-41-865-1820. E-mail address: jpark@tiger.korea.ac.kr (J. Park).