Abstract

Motivated by a biologically plausible short-term memory sketchpad, Kak's Fast Classification (FC) neural networks are instantaneously trained using a prescriptive training scheme. Both the weights and the topology of an FC network are specified with only two presentations of the training samples. Compared with iterative learning algorithms such as backpropagation (which may require many thousands of presentations of the training data), the training of FC networks is extremely fast, and learning convergence is always guaranteed. FC networks are therefore suitable for applications where real-time classification and adaptive filtering are needed. In this paper we show that FC networks are "hardware friendly" for implementation on FPGAs. Their unique prescriptive learning scheme can be integrated with the hardware design of the FC network through parameterization and compile-time constant folding.

1. Introduction

There exist certain classes of real-time classification and adaptive control systems in which learning is a critical task that must be guaranteed to finish on time. Reconnaissance robots and satellite sensory systems looking for interesting objects in unfamiliar environments are examples of such real-time systems. In these systems, one cannot fully anticipate the range of objects the classification system may encounter; the system must learn to classify new objects in real time as it continues to explore. Neural networks have been shown to be powerful classification tools. However, neural networks based on iterative learning algorithms, such as multilayer perceptrons, radial basis functions and support vector machines, can suffer from a training bottleneck [1]: learning may not converge, or may take too long to be useful in real-time applications, even with hardware acceleration.
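The contrast between iterative and instantaneous training can be made concrete with a small sketch. The following Python fragment implements Kak's CC4 corner-classification rule, a representative of the prescriptive schemes on which FC networks build (the exact FC formulation is given in Section 2); the function names and the radius parameter r are illustrative, not part of this paper's notation. All weights are assigned in a single pass over the training vectors, with no iteration:

```python
def train_cc4(samples, labels, r=0):
    """Prescriptively build CC4 weights in one pass over the data.

    samples: binary input vectors (tuples of 0/1)
    labels:  0/1 class labels, one per sample
    r:       radius of generalization (Hamming distance tolerated)
    """
    hidden, out = [], []
    for x, y in zip(samples, labels):
        s = sum(x)  # number of 1-bits in the training vector
        # Input weights: +1 where the bit is 1, -1 where it is 0,
        # plus a bias weight of r - s + 1 on an always-on bias input.
        hidden.append([1 if b else -1 for b in x] + [r - s + 1])
        # Output weight: +1 if the sample belongs to the class, else -1.
        out.append(1 if y else -1)
    return hidden, out

def classify(x, hidden, out):
    """Feed-forward pass with binary-step neurons (fire when net > 0)."""
    xs = list(x) + [1]  # append the always-on bias input
    h = [int(sum(w * v for w, v in zip(ws, xs)) > 0) for ws in hidden]
    return int(sum(w * v for w, v in zip(out, h)) > 0)
```

A hidden neuron built this way produces a net input of r + 1 - d for a query at Hamming distance d from its training vector, so it fires exactly when d is at most r; with r = 0 the network memorizes the training set, solving even XOR in one pass.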
For this reason, iterative learning neural networks are not suitable for real-time systems where learning is a critical task. Kak's Fast Classification (FC) networks [2] overcome the learning bottleneck by employing instantaneous learning. The model of FC networks is motivated by a biologically plausible sketchpad mechanism for short-term memory in which learning occurs instantaneously. Learning in FC networks does not suffer from the learning bottleneck and is always guaranteed to converge. Both the weights and the topology of an FC network are determined by simple inspection of the training examples. Only two presentations of the training samples are required to train an FC network, which is extremely efficient compared with iterative learning algorithms, such as backpropagation, where thousands of presentations of the training samples are required.

In this paper, we show that Kak's FC networks with their prescriptive learning scheme are well suited for implementation on FPGA-based reconfigurable hardware platforms by exploiting fine-grained parallelism. We show that the prescriptive learning algorithm can be integrated into the hardware design of FC networks through parameterization and compile-time constant folding.

The remainder of this paper is organised as follows. Section 2 describes the algorithmic framework for FC networks; operations in the training and execution phases of FC networks are formally presented. Section 3 presents the hardware design for FC networks: the overall system architecture is outlined first, followed by implementations of the network components, namely the hidden neurons, the hidden-layer rule-bases and the output neurons. Section 4 discusses strategies for integrating prescriptive learning with the design of FC networks, and Section 5 draws some conclusions.

2.
Algorithmic Framework for FC Networks

FC networks have a three-layer feed-forward architecture which consists of a layer of inputs, a layer of dis-

An FPGA Implementation of Kak's Instantaneously-Trained, Fast-Classification Neural Networks
Jihan Zhu and Peter Sutton
School of Information Technology and Electrical Engineering
The University of Queensland, Brisbane QLD 4072, Australia
{jihan, p.sutton}@itee.uq.edu.au