The successes of the standard model of electromagnetic, weak, and strong interactions have been remarkable. Nevertheless, this model contains many assumptions and undetermined parameters that are displeasing aesthetically. Also, the model has not yet produced a satisfying route to unifying gravity with the other three fundamental forces.

The goal of particle physicists now is to discover where the standard model fails and so to find clues to a better theory, perhaps ultimately achieving Einstein’s dream of unifying all forces including gravity. It is hoped that the clues will emerge from highly sophisticated experiments in which particles are accelerated and smashed together at very high energies. To date, however, no deviations from the standard model have been found, and so the search for new clues must be performed at even higher energies—such as those that will be achieved at the Superconducting Super Collider (SSC) now under construction in Texas.

Much theoretical analysis is still needed to interpret the results of these extraordinary and expensive experiments. The strong-interaction part of the standard model—quantum chromodynamics, or QCD—presents the major computational stumbling block. Qualitatively, QCD has all the right properties, but so far theoretical physicists have not been able to extract accurate predictions from this precise mathematical model with the traditional tools of the theoretical physicist—pencil and paper. To obtain reliable self-consistent results when dealing with the strong force between, say, two protons requires calculating many subprocesses involving quarks and gluons. In fact, the number of subprocesses is so large that the calculation far exceeds the scope of analytical techniques.

The solution is to turn to a new tool: the supercomputer. Large-scale numerical simulations of QCD are the most promising technique for analyzing the strong interactions. In order to solve QCD on a computer one has to approximate space and time by a four-dimensional grid, or lattice, of points. The discretized version of the theory is called lattice QCD. Experimentally measurable quantities (such as the particle masses and the probabilities of specific transitions) are determined from a statistical average over quantum fluctuations in the quark and gluon fields. The fluctuations at each position in the lattice are simulated by a Monte Carlo procedure, so each Monte Carlo calculation determines one state in a statistical sample of possible states of a system. Monte Carlo methods are an efficient way of sampling the important states, that is, states that give the dominant contributions to the process. The best Monte Carlo calculations to date have used lattices of size up to 32³ × 48 and generated only a small statistical sample (twenty to fifty of the possible states).

The three sources of errors in such simulations are the lattice size, the lattice spacing, and the limited statistical sample. These errors can be systematically reduced by making the lattice size larger, the statistical sample larger, and the lattice spacing smaller. To reduce statistical and systematic errors to the level of a few percent requires a computer with a very large memory and a very high operating speed, over 1000 billion arithmetic operations per second. For comparison, a typical state-of-the-art home computer has a few million bytes of memory and runs at a few million operations per second.
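The Monte Carlo strategy can be illustrated with a toy calculation. The sketch below, written in Python purely for readability, applies the Metropolis algorithm to a free scalar field on a tiny 4 × 4 × 4 × 4 lattice rather than to the quark and gluon fields of QCD; the lattice size, the action, the observable, and all parameter values are illustrative assumptions chosen for brevity, not those of an actual lattice-QCD code. It shows the two ingredients described above: preferential acceptance of the important states, and a statistical average, with its error, taken over a modest sample of configurations.

# A toy illustration (not QCD): Metropolis Monte Carlo sampling of a scalar
# field phi(x) on a small 4-dimensional lattice.  All numbers below are
# hypothetical stand-ins; real lattice-QCD codes sample SU(3) gauge links
# and quark fields on far larger grids.

import math
import random

L = 4                      # lattice points per direction (real runs: ~32)
VOLUME = L ** 4            # number of sites in the 4-d grid
MASS2 = 0.5                # (mass x lattice spacing)^2 in the toy action
N_SAMPLES = 50             # size of the statistical sample (cf. 20-50 states)
SWEEPS_PER_SAMPLE = 10     # Metropolis sweeps between stored configurations

random.seed(1)             # fixed seed so the toy run is reproducible
phi = [0.0] * VOLUME       # one real number per site; QCD stores far more

def neighbors(site):
    """Indices of the 8 nearest neighbors of a site on the periodic 4-d grid."""
    coords = [(site // L ** d) % L for d in range(4)]
    for d in range(4):
        for shift in (+1, -1):
            c = coords.copy()
            c[d] = (c[d] + shift) % L
            yield sum(c[k] * L ** k for k in range(4))

def local_action(site, value):
    """Part of a free-field (Gaussian) lattice action that depends on one site."""
    s = 0.5 * (8 + MASS2) * value * value
    s -= value * sum(phi[n] for n in neighbors(site))
    return s

def metropolis_sweep():
    """Propose a new value at every site; accept with probability exp(-dS)."""
    for site in range(VOLUME):
        old = phi[site]
        new = old + random.uniform(-1.0, 1.0)
        d_action = local_action(site, new) - local_action(site, old)
        if d_action < 0 or random.random() < math.exp(-d_action):
            phi[site] = new   # important states are kept preferentially

# Generate a statistical sample of field configurations and average an
# observable (here <phi^2>) over it; the statistical error falls roughly
# like 1/sqrt(N_SAMPLES), one of the three error sources discussed above.
measurements = []
for _ in range(N_SAMPLES):
    for _ in range(SWEEPS_PER_SAMPLE):
        metropolis_sweep()
    measurements.append(sum(p * p for p in phi) / VOLUME)

mean = sum(measurements) / N_SAMPLES
err = math.sqrt(sum((m - mean) ** 2 for m in measurements)
                / (N_SAMPLES * (N_SAMPLES - 1)))
print(f"<phi^2> = {mean:.4f} +/- {err:.4f} from {N_SAMPLES} sampled states")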
The required technology is just beginning to appear in the form of the parallel supercomputer. In fact, scientists interested in solving the riddle of QCD have played a significant role in the development of parallel supercomputers. The basic principle of these new machines is simple—thousands of processors work simultaneously, each on its own piece of the calculation.
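As a rough illustration of that principle, the sketch below divides a 32³ × 48 lattice among a hypothetical grid of 512 processors. The processor count, the memory accounting, and the surface-to-volume estimate are assumptions made only to show how the data and the work might be partitioned, not a description of any particular machine.

# Back-of-the-envelope sketch of domain decomposition for lattice QCD: the
# 4-d lattice is cut into equal sublattices, one per processor, and each
# processor exchanges only the faces of its sublattice with its neighbors.
# The machine and lattice sizes below are illustrative assumptions.

lattice = (32, 32, 32, 48)          # global lattice (cf. 32^3 x 48 in the text)
proc_grid = (4, 4, 4, 8)            # hypothetical 512-processor partitioning

sublattice = tuple(l // p for l, p in zip(lattice, proc_grid))
sites_per_proc = 1
for s in sublattice:
    sites_per_proc *= s

# Each site carries 4 SU(3) gauge-link matrices (4 directions x 9 complex
# numbers x 16 bytes in double precision), ignoring quark fields and work space.
bytes_per_site = 4 * 9 * 16
print("sublattice per processor :", sublattice)
print("sites per processor      :", sites_per_proc)
print("gauge-field memory/node  :", sites_per_proc * bytes_per_site // 1024, "KB")

# Surface sites must be communicated to neighboring processors each sweep;
# the surface-to-volume ratio gives a crude measure of communication overhead.
interior = 1
for s in sublattice:
    interior *= max(s - 2, 0)
surface = sites_per_proc - interior
print("fraction of sites on the surface:", round(surface / sites_per_proc, 2))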