Psychological Bulletin
1991, Vol. 110, No. 2, 350-374
Copyright 1991 by the American Psychological Association, Inc.
0033-2909/91/$3.00

Analyses of Multinomial Mixture Distributions: New Tests for Stochastic Models of Cognition and Action

Steven Yantis
Johns Hopkins University

David E. Meyer and J. E. Keith Smith
University of Michigan

Mixture distributions are formed from a weighted linear combination of 2 or more underlying basis distributions [g(x) = Σj αjfj(x); Σj αj = 1]. They arise frequently in stochastic models of perception, cognition, and action in which a finite number of discrete internal states are entered probabilistically over a series of trials. This article reviews various distributional properties that have been examined to test for the presence of mixture distributions. A new multinomial maximum likelihood mixture (MMLM) analysis is discussed for estimating the mixing probabilities αj and the basis distributions fj(x) of a hypothesized mixture distribution. The analysis also generates a maximum likelihood goodness-of-fit statistic for testing various mixture hypotheses. Stochastic computer simulations characterize the statistical power of such tests under representative conditions. Two empirical studies of mental processes hypothesized to involve mixture distributions are summarized to illustrate applications of the MMLM analysis.

Throughout many areas of psychology, stochastic models with a combination of systematic and random components have been proposed to characterize important aspects of human behavior (e.g., Coombs, Dawes, & Tversky, 1970; Luce, Bush, & Galanter, 1963; Townsend & Ashby, 1983). The underlying components postulated by these models often correspond to a finite set of discrete probabilistic mental or physical states (e.g., levels of expectation, preparation, and motivation).
Given this correspondence, a subject's observed performance (e.g., the identities, latencies, and magnitudes of overt responses) may fluctuate in a relatively simple fashion across successive trials of an experiment, depending on which state or states the subject occupies during the course of each trial.

Author Note

Portions of this research were first reported at the 15th annual meeting of the Society for Mathematical Psychology, Princeton, New Jersey (Smith, Meyer, Yantis, & Osman, 1982). Preparation of this article was supported by National Institute of Mental Health Grants R01-MH43924 to Johns Hopkins University, Steven Yantis, principal investigator, and R01-MH38845 to the University of Michigan, David E. Meyer, principal investigator.

We thank Allen Osman for helpful advice, Carol Huff for superb technical assistance, and Toby Mordkoff for computer programming. Saul Sternberg, John Theios, and several anonymous reviewers provided valuable comments on previous versions of the manuscript.

Correspondence concerning this article should be addressed to Steven Yantis, Department of Psychology, Johns Hopkins University, Baltimore, Maryland 21218 (electronic mail may be sent to yantis@jhuvms), or to David E. Meyer and J. E. Keith Smith, Department of Psychology, University of Michigan, 330 Packard Road, Ann Arbor, Michigan 48104.

Specifically, suppose the following:

(a) There is a random variable X, the values of which represent some behavioral or psychophysiological measure (e.g., reaction time; RT) taken on a trial-by-trial basis.

(b) On each trial, the subject enters a particular state sj, selected probabilistically from a finite set of J possible states {s1, s2, …, sJ}.

(c) The probability of entering state sj is αj, where 0 < αj < 1 and Σj αj = 1.

(d) The value of X obtained for a particular trial depends only on which state sj is occupied.
(e) Each state sj has associated with it a probability-density or probability-mass function, fj(x), that characterizes the distribution of X when sj is occupied.

Then we can express the distribution of X over all trials in terms of a mixture distribution that has a probability density function g(x) formed from a linear combination of the individual functions fj(x), weighted by the probabilities αj of the respective states sj. Formally,

g(x) = Σj αjfj(x),  j = 1, 2, …, J.    (1)

Here the fj(x) are called basis distributions, and the αj are called mixing probabilities. In effect, the magnitudes of the mixing probabilities determine the relative contribution made by each basis distribution to the overall mixture distribution. Across the conditions of an experiment, these probabilities may change systematically. Thus, if an experiment includes K different conditions, it may yield a family of K distinct mixture distributions gk(x) (k = 1, 2, …, K), where each member of the family is formed from the same basis distributions fj(x) (j = 1, 2, …, J) but has its own unique set of mixing probabilities.

An important special case of Equation 1 is the binary mixture distribution,

g(x) = αf1(x) + (1 − α)f2(x).    (2)

Here, J = 2 and α = α1 = 1 − α2. This case arises under conditions in which there are only two alternative states, s1 and s2,
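The generative process behind Equations 1 and 2 can be sketched as a simple trial-by-trial simulation: on each trial one state is selected with its mixing probability, and X is then drawn from that state's basis distribution. In the minimal sketch below, the two basis distributions are assumed to be normal RT distributions with hypothetical parameter values chosen only for illustration; the article itself assumes no particular parametric form.

```python
import random

def sample_mixture(alpha, f1, f2, n_trials, rng):
    """Simulate n_trials of the binary mixture in Equation 2:
    state s1 is entered with probability alpha (X drawn from f1);
    otherwise s2 is entered (X drawn from f2)."""
    return [f1(rng) if rng.random() < alpha else f2(rng)
            for _ in range(n_trials)]

rng = random.Random(1)
# Hypothetical basis distributions: a fast and a slow RT state.
f1 = lambda r: r.gauss(400.0, 40.0)   # state s1
f2 = lambda r: r.gauss(700.0, 60.0)   # state s2
alpha = 0.3

xs = sample_mixture(alpha, f1, f2, 100_000, rng)

# The mean of the mixture is the weighted combination of the basis
# means: alpha*400 + (1 - alpha)*700 = 610 for these parameters.
print(sum(xs) / len(xs))
```

With a large number of simulated trials, the empirical mean converges on the weighted combination of the basis means, mirroring how the mixing probabilities determine each basis distribution's contribution to g(x).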