VOLUME 72, NUMBER 9    28 FEBRUARY 1994

Analysis of Genetic Algorithms Using Statistical Mechanics

Adam Prügel-Bennett and Jonathan L. Shapiro
Department of Computer Science, University of Manchester, Manchester M13 9PL, United Kingdom
(Received 11 June 1993)

A formalism is developed for studying genetic algorithms by considering the evolution of the distribution of fitness in the population. The effects of selection on the population are problem independent. The formalism predicts the optimal amount of selection. Crossover is solved for a model problem: finding low energy states of the one dimensional Ising spin glass. The theory is found to be in good agreement with simulations.

PACS numbers: 02.50.-r, 05.50.+q

Genetic algorithms (GAs) are search techniques for finding good solutions to hard problems [1,2]. They have been applied to problems as diverse as the traveling salesman problem and the design of efficient aerofoils. Instead of making changes to a single solution, a population of solutions is evolved. Improvements are made by combining good solutions to produce (possibly) better ones.

To understand how GAs work, and thus to optimize their performance, it is necessary to understand their dynamics. Although a genetic algorithm can be described as a Markov chain and thus solved formally [3], the effects of finite population size make the transition probabilities very complicated. Consequently, this formulation has not yielded a predictive description of genetic dynamics. Another approach is to exploit the well-known relationship between stochastic dynamics and the statistical mechanics of disordered systems. This is the subject of this paper. Statistical mechanics has already been applied to the study of other genetic dynamics (e.g., [4,5]).
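The GA loop described above (evolve a population, reproduce in proportion to fitness, combine good solutions) can be sketched in Python. This is an illustrative minimal implementation, not the specific operators analyzed in this Letter; the population size, the fitness-proportional selection scheme, the fixed one-point crossover, and the toy bit-string fitness are all our own choices for demonstration.

```python
import random

def genetic_algorithm(fitness, random_individual, crossover,
                      pop_size=50, generations=100):
    """Minimal GA: fitness-proportional selection plus crossover.

    `fitness` must return non-negative values for this selection
    scheme; all operator choices here are illustrative.
    """
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(s) for s in population]
        # Select parents with probability proportional to fitness,
        # then combine pairs of good solutions via crossover.
        parents = random.choices(population, weights=weights, k=2 * pop_size)
        population = [crossover(parents[2 * i], parents[2 * i + 1])
                      for i in range(pop_size)]
    return max(population, key=fitness)

# Toy usage: maximize the number of 1s in a 20-bit string.
random.seed(0)
best = genetic_algorithm(
    fitness=sum,
    random_individual=lambda: [random.randint(0, 1) for _ in range(20)],
    crossover=lambda a, b: a[:10] + b[10:],
)
```

Even this crude scheme rapidly concentrates the population on high-fitness strings, which is why the dynamics of selection and crossover, rather than any single solution, are the natural objects of study.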
However, in these previous studies the selection of the individual to reproduce is random; there is no notion of fitness, whereas in a GA an individual reproduces with a probability determined by its fitness. In addition, the mechanisms of genetic mixing in the previous works were simpler than the crossover of the GA.

In this Letter we show that statistical mechanics can be used to predict the evolution of a GA. We show that selection can be understood in terms of Derrida's random energy model [6]. To elucidate other aspects we study a toy problem: finding low lying states of the one dimensional Ising spin glass. Here we consider only the two most important operations, selection and crossover. The techniques we have used can be readily extended to other problems and other GA operators. A fuller discussion will be given elsewhere [7].

The techniques developed here are useful in extending simple genetic and population models to include selection and crossover. In addition, they could have practical benefits for those applying GAs to optimization. This formulation predicts the evolution of the GA in terms of the amount of selection and other operators; this knowledge could help find the optimal values of selection and crossover. As an example, many investigators have found that increasing the degree of selection as the population evolves considerably improves performance [8]. The statistical mechanics formulation predicts the optimal amount of selection in terms of properties of the distribution of fitness, and shows that this optimum should increase during evolution.

By studying a problem with well defined statistical properties, the techniques of statistical mechanics can be used to calculate the behavior of a typical sample. The toy problem we examine is that of finding low lying states of a one dimensional spin glass with random nearest-neighbor couplings J_i drawn from a Gaussian distribution with zero mean and unit variance.
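The idea of a tunable "amount of selection" can be made concrete with a Boltzmann-type selection scheme, in which an inverse temperature beta interpolates between random reproduction (beta = 0, no notion of fitness) and picking only the best individuals (large beta). This sketch is ours: the Letter's actual prediction of the optimal selection strength requires the formalism developed in the text, and the function name and parameters here are illustrative.

```python
import math
import random

def boltzmann_select(population, energy, beta, k):
    """Sample k individuals with weight exp(-beta * E).

    beta sets the selection strength: beta = 0 reproduces individuals
    at random, large beta selects only the lowest-energy ones.
    Illustrative scheme, assuming energies are to be minimized as in
    the spin-glass problem below.
    """
    energies = [energy(s) for s in population]
    e_min = min(energies)  # subtract the minimum for numerical stability
    weights = [math.exp(-beta * (e - e_min)) for e in energies]
    return random.choices(population, weights=weights, k=k)
```

Increasing beta over the course of a run is one way of realizing the schedule of growing selection pressure mentioned above.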
The energy for a configuration of spins, S = (S_1, S_2, ..., S_{N+1}), is

E(S) = -\sum_{i=1}^{N} J_i S_i S_{i+1} .

The ground state energy is E_min = -\sum_{i=1}^{N} |J_i|. Although

0031-9007/94/72(9)/1305(5)$06.00  © 1994 The American Physical Society  1305
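For the open chain, the energy and the exact ground state bound E_min = -∑_i |J_i| can be verified directly: fixing S_1 = +1 and aligning each successive spin with the sign of its bond makes every term contribute -|J_i|. A sketch in Python (spins are ±1, couplings drawn as stated in the text; the function names are ours):

```python
import random

def spin_glass_energy(S, J):
    """E(S) = -sum_i J_i * S_i * S_{i+1} for an open chain of N+1 spins."""
    return -sum(Ji * Si * Sj for Ji, Si, Sj in zip(J, S, S[1:]))

def ground_state(J):
    """Construct a ground state of the open 1D chain.

    Fix S_1 = +1, then set S_{i+1} = sign(J_i) * S_i so that each bond
    contributes -|J_i|, giving E_min = -sum_i |J_i| exactly.
    """
    S = [1]
    for Ji in J:
        S.append(S[-1] if Ji >= 0 else -S[-1])
    return S

N = 10
# Couplings with zero mean and unit variance, as in the text.
J = [random.gauss(0.0, 1.0) for _ in range(N)]
S0 = ground_state(J)
E0 = spin_glass_energy(S0, J)
```

Because the chain is open, frustration is absent and the ground state is found in a single pass; the difficulty for the GA lies in reaching it through selection and crossover alone.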