Thresholded Learning Matrix for Efficient Pattern Recalling

Mario Aldape-Pérez, Israel Román-Godínez, and Oscar Camacho-Nieto

Center for Computing Research (CIC), National Polytechnic Institute (IPN), Mexico City, Mexico
mario@aldape.org.mx, iromanb05@sagitario.cic.ipn.mx, oscarc@cic.ipn.mx
http://www.aldape.org.mx

Abstract. The Lernmatrix, the first known model of associative memory, is a heteroassociative memory that can easily work as a binary pattern classifier if output patterns are appropriately chosen. However, this mathematical model misclassifies fundamental patterns whenever crossbar saturation occurs. In this paper, a novel algorithm that overcomes this weakness of the Lernmatrix is proposed. Crossbar saturation is resolved by means of a dynamic threshold value computed for each recalled pattern. The algorithm applies this dynamic threshold to the ambiguously recalled class vector in order to obtain a sentinel vector, which is then used to eliminate the uncertainty. The efficiency and effectiveness of our approach are demonstrated through comparisons with other methods using real-world data.

Keywords: Associative Memories, Dynamic Threshold, Lernmatrix, Pattern Classification, Supervised Learning.

1 Introduction

Karl Steinbuch, a pioneer of artificial neural networks, was one of the first researchers to develop functional structures (square arrays known as crossbars) that use conditional connections for adaptation in categorization tasks [1]. The significance of the Lernmatrix is evidenced by Kohonen's statement that Correlation Matrices substitute Steinbuch's Lernmatrix [2]. The Lernmatrix, the first known model of associative memory, is a heteroassociative memory that can easily work as a binary pattern classifier if output patterns are appropriately chosen [3]. Nonetheless, complete recall is not guaranteed whenever crossbar saturation occurs [4].
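To make the classifier setup concrete, the following is a minimal sketch of the classic Lernmatrix learning and recall rules in their usual formulation (one-hot class outputs, a learning increment ε, and a maximum-response recall rule). This is an assumption about the baseline model the paper builds on, not the thresholded variant proposed here; all function names are illustrative.

```python
import numpy as np

def train_lernmatrix(patterns, labels, n_classes, eps=1.0):
    """Build a Lernmatrix M from binary input patterns and their class labels.

    Each class k is encoded as a one-hot output vector, so only row k of M
    is updated per pattern: reinforced by +eps where x_j = 1 and penalized
    by -eps where x_j = 0 (the usual Lernmatrix learning rule).
    """
    n_features = len(patterns[0])
    M = np.zeros((n_classes, n_features))
    for x, k in zip(patterns, labels):
        x = np.asarray(x)
        M[k] += np.where(x == 1, eps, -eps)
    return M

def recall(M, x):
    """Recall a class vector: 1 at every row attaining the maximal response.

    When several rows tie for the maximum (crossbar saturation), the result
    is a multi-hot, ambiguous class vector -- the failure mode the paper's
    dynamic threshold is designed to resolve.
    """
    s = M @ np.asarray(x)
    return (s == s.max()).astype(int)

# Usage: two classes on a small, saturation-free toy example.
X = [[1, 0, 1, 0], [0, 1, 0, 1]]
labels = [0, 1]
M = train_lernmatrix(X, labels, n_classes=2)
print(recall(M, [1, 0, 1, 0]))  # → [1 0]
```

With disjoint patterns like these the recalled class vector is unambiguous; overlapping fundamental patterns can saturate the crossbar and produce ties in the recall step.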
In the following section, a brief description of the fundamentals of associative memories is presented. In Section 3, the foundations of the Thresholded Learning Matrix are presented. An illustrative example is given in Section 4, while in Section 5 some experimental results on real-world data are shown. The advantages of the Thresholded Learning Matrix, as well as a short conclusion, are discussed in Section 6.

J. Ruiz-Shulcloper and W.G. Kropatsch (Eds.): CIARP 2008, LNCS 5197, pp. 445–452, 2008.
© Springer-Verlag Berlin Heidelberg 2008