On the biological plausibility of the artificial metaplasticity learning algorithm

Diego Andina (a,*), Francisco J. Ropero-Peláez (b)

(a) Group for Automation in Signal and Communications, Technical University of Madrid, Spain
(b) Center for Mathematics, Computation and Cognition, Universidade Federal do ABC, Brazil

Keywords: Artificial Neural Networks; Artificial learning; Neural metaplasticity; BCM rule

Abstract. The training algorithm studied in this paper is inspired by the biological metaplasticity property of neurons. During the training phase, the Artificial Metaplasticity Learning Algorithm can be considered a probabilistic version of the presynaptic rule: it assigns larger values for updating the weights to the less probable activations than to the more probable ones. The algorithm is proposed for Artificial Neural Networks in general, although at the moment it has only been implemented and tested for Multilayer Perceptrons. Experiments on several multidisciplinary applications show much more efficient training, raising Multilayer Perceptron results to the performance of the best systems in the state of the art, which are usually much more complex. © 2012 Elsevier B.V. All rights reserved.

1. Introduction

The term Artificial Metaplasticity (AMP) was first introduced by Andina et al. [1] for an Artificial Neural Network (ANN) of the Multilayer Perceptron type (MLP), referred to as AMMLP. During the AMMLP training phase, the weight matrix W that models the synaptic strengths of its artificial neurons is updated according to the probability of the input patterns and, therefore, of the corresponding synaptic activations. The concept of biological metaplasticity was defined in 1996 by Abraham [2] and is now widely applied in the fields of biology, neuroscience, physiology, neurology and others [3–5]. The prefix "meta" comes from Greek and means "beyond" or "above".
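The probability-dependent weight update described above can be sketched as follows. This is an illustrative reading, not the authors' exact implementation: the learning step for each pattern is scaled by the inverse of an assumed Gaussian-shaped density over the inputs, so improbable activations receive larger updates. The function names, the density model, and the constants A and B are assumptions.

```python
import numpy as np

# Illustrative sketch of an artificial-metaplasticity (AMP) style update:
# the weight step for each training pattern is divided by an assumed
# probability density of that pattern, so rare activations produce larger
# updates than frequent ones. The Gaussian-shaped density and the
# constants A, B are assumptions for illustration, not the paper's values.

def amp_scale(x, A=1.0, B=0.2):
    """Inverse of an assumed Gaussian-like pdf of the input pattern x."""
    density = B * np.exp(-np.sum(x ** 2))   # assumed pattern density
    return A / density                      # large for improbable patterns

def amp_update(w, x, error, lr=0.01):
    """Gradient-style step scaled by the metaplasticity factor."""
    return w + lr * amp_scale(x) * error * x

x_common = np.array([0.1, 0.0, -0.1])  # near the assumed density's mode
x_rare = np.array([1.5, -1.5, 1.5])    # far from the mode -> low density
assert amp_scale(x_rare) > amp_scale(x_common)
```

In this reading, the metaplasticity factor plays the role of an importance weight: frequent patterns, which the network already represents well, contribute small corrections, while rare patterns drive proportionally larger ones.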
In neuroscience and related fields, "metaplasticity" indicates a higher level of plasticity, expressed as a change or transformation in the way synaptic efficacy is modified. Metaplasticity is defined as the induction of synaptic changes that depend on prior synaptic activity [3,5]. Metaplasticity is due, at least in part, to variations in the level of postsynaptic depolarisation that induce synaptic changes: these variations facilitate synaptic potentiation and inhibit synaptic depression in depressed synapses (and vice versa in potentiated synapses). The direction and degree of the synaptic alteration are functions of postsynaptic depolarisation during synaptic activation. Upregulation, i.e. the incrementing or reinforcement of synaptic efficacy, is termed long-term potentiation (LTP), whereas downregulation, i.e. its decrementing or inhibition, is known as long-term depression (LTD). LTP and LTD are believed to be fundamental to the storage of memory in the brain, and hence to learning. The induction of synaptic changes as a function of the level of neural activity is explained in [6] and illustrated in Fig. 1. Metaplasticity can be represented as variations in curve elongation with respect to the level of activity, and implies a rightward shift of the LTP threshold T_M, M = 0, 1, 2, … (where the curves cross the horizontal axis), according to the time-averaged level of postsynaptic firing α_M. More recent studies [22] showed that there are also LTD thresholds that diminish under the same circumstances. In summary, once synapses are positively primed (i.e. there is an increment in weight), the interval between thresholds broadens, thereby favouring subsequent synaptic depression. Understanding metaplasticity may yield new insights into how the modification of synapses is regulated and how information is stored by synapses in the brain [7–9]. Synaptic plasticity refers to the efficient modulation of information transmission between neurons, and is related to the regulation of the number of ionic channels in synapses.
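The sliding-threshold behaviour described above (the BCM rule named in the keywords) can be illustrated with a minimal single-synapse simulation. Everything below is an assumed toy parameterisation, not the paper's model; the rates lr and tau are arbitrary illustrative choices:

```python
# Minimal BCM-style sketch of a sliding LTP/LTD threshold (illustrative
# toy model; the rates lr and tau are assumptions). The threshold theta
# tracks the time-averaged squared postsynaptic activity, so sustained
# high firing shifts it rightward, as described for T_M above.

def bcm_step(w, x, theta, lr=0.01, tau=0.1):
    y = w * x                               # postsynaptic activity
    w = w + lr * x * y * (y - theta)        # LTP if y > theta, LTD if y < theta
    theta = theta + tau * (y ** 2 - theta)  # threshold slides with activity
    return w, theta

w1, t1 = bcm_step(w=1.0, x=1.0, theta=0.5)
assert w1 > 1.0   # activity above threshold -> potentiation (LTP)
assert t1 > 0.5   # ...and the threshold itself has moved up

w2, _ = bcm_step(w=1.0, x=1.0, theta=1.5)
assert w2 < 1.0   # activity below threshold -> depression (LTD)
```

The key property the sketch captures is that the same level of activity can be potentiating or depressing depending on the recent history encoded in theta, which is exactly the "plasticity of plasticity" that the term metaplasticity denotes.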
Synaptic plasticity mechanisms involve both molecular and structural modifications that affect synaptic functioning, either enhancing or depressing neuronal transmission. They include the redistribution of postsynaptic receptors, the activation of intracellular signalling cascades, and the formation/retraction of dendrites [10]. The first model of synaptic plasticity was postulated by Hebb and is known as the Hebb rule [11]. Fig. 1 illustrates the effect of metaplasticity. Each curve indicates the biological variation of the synaptic weight, Δω, with respect to the neuron's activation frequency or postsynaptic activity. If postsynaptic activity is high, the curve shifts by the metaplasticity property, reinforcing the LTP. Thus, for different values of the time-averaged level of postsynaptic firing α_M, Fig. 1 shows a family of curves.
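For contrast with the metaplastic picture, the classical Hebb rule mentioned above can be sketched in a few lines; the learning rate eta is an arbitrary illustrative choice:

```python
import numpy as np

# Classical Hebb rule: delta_w = eta * x * y, i.e. a synapse strengthens
# when pre- and postsynaptic activity coincide. Note that there is no
# sliding threshold and no depression term; that is what metaplasticity
# and BCM-type rules add to this picture.

def hebb_update(w, x, eta=0.1):
    y = float(np.dot(w, x))   # postsynaptic activity
    return w + eta * y * x    # potentiate co-active synapses only

w = np.array([0.5, 0.5])
x = np.array([1.0, 0.0])      # only the first input is active
w = hebb_update(w, x)
assert w[0] > 0.5 and w[1] == 0.5
```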