Everything Computes

Peter J Bentley

Abstract

The emergence of complexity in our universe is caused by generative processes that confound us. Collective behaviours going under names such as evolution, swarming, embryogenesis, thought, and emergence produce results that seem counterintuitive (or downright impossible) when the individual elements in the collective are examined. In this seminar I explore the causes of complex behaviour in general and suggest a new viewpoint based on a combination of systemics and computation. I suggest that all behaviour in the universe can be viewed as a form of computation – a systemic computation – which is in many respects faster and more efficient than the classical views of computation embodied in traditional computer designs. By viewing behaviour, and specifically biological systems, as types of highly parallel, nested computations, we can begin to understand how complex forms and functions emerge. With this knowledge we might one day be able to exploit the computation of natural systems and use it for our own purposes.

Introduction

The most bizarre thing about the universe around us is that it exists. The second most bizarre thing is why it behaves in the way it does. We're not much closer to providing a satisfactory answer to the first question, despite some imaginative theories. But we're finding much better answers to the second.

The behaviour of entities in our universe seems relatively straightforward at a low level. Quarks, atoms, and molecules all interact according to (more or less) known properties, resulting in the familiar forms of matter and energy we observe every day. But when we look at more complex groups of molecules, it all starts to feel confusing. A virus is an astonishing molecule. A cell is a monstrously complicated group of molecules. An organism is a terrifyingly complex group of cells. A species is a remarkable group of organisms. An ecology is a surprising group of species. And so it goes on. The superlatives come easily because our brains are not clever enough to grasp how a generative process can create such complexity on its own. How can a design exist without a designer? How can something physical exist without a builder? How can damage be repaired without a repairer? How can adaptation to change occur without an adaptor?

Computation is changing our conceptions of what is and is not possible. Today we can demonstrate that all of these seemingly miraculous effects can occur inside our computers. We can show how collections of smaller components interact and generate extraordinary emergent patterns and behaviour. There is no need for a designer or ruler of these systems – they generate their own behaviour by themselves. The main thing we have to do is make our computer act like a womb in which something akin to biology can grow.

At this point we run into problems. A conventional computer may theoretically be able to compute everything that occurs in a biological system, but even our fastest computers struggle to calculate the interactions of two complex molecules. Somehow biology is much better at performing computations than our computers are. In fact, everything in the universe is much better at computing. Everything computes. But it doesn't compute in the same way that our computers do.
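To make the point about emergence concrete, the following small sketch (an illustration added here, not part of the original text) shows designer-free pattern formation inside a conventional computer: Wolfram's elementary cellular automaton Rule 30, in which every cell obeys the same trivial local rule (look only at yourself and your two immediate neighbours), yet the collective produces an intricate, unpredictable global pattern. The rule number, grid width and number of steps are arbitrary illustrative choices.

    # Emergence from local rules: Wolfram's elementary cellular automaton
    # Rule 30. Each cell's next state depends only on itself and its two
    # neighbours, yet the row of cells develops a complex global pattern.

    RULE = 30  # 8-bit lookup table: neighbourhood (left, centre, right) -> new state

    def step(cells):
        """Update every cell simultaneously, wrapping around at the edges."""
        n = len(cells)
        return [
            (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    def run(width=79, steps=30):
        cells = [0] * width
        cells[width // 2] = 1  # start from a single 'live' cell
        for _ in range(steps):
            print("".join("#" if c else " " for c in cells))
            cells = step(cells)

    if __name__ == "__main__":
        run()

No cell knows anything about the overall picture; the structure emerges purely from repeated local interaction, which is exactly the kind of designer-free design described above.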
If we can understand what kind of computer we need in order to produce natural complexity similar to that of biological systems, we might just understand a little more about the kind of computation going on in the universe around us. Maybe we might even learn something about why things behave the way they do.

Computers are not good at computing

You might say that humankind has become a little obsessed with computers recently. The internet and the imminent rise of ubiquitous computing will soon ensure that a computer is incorporated into most devices around us (Gershenfeld, 2000). Today we have computers in unlikely places such as light switches, toothbrushes, radios, car engines, pens and pets. In the coming decades these will only become more numerous and more integrated, as wireless communication enables every device to communicate with its neighbours in dynamic local networks (Weiser, 1993).

But the obsession is an unhealthy one. Our computers are failing us. As computation becomes more dynamic and unpredictable, with viruses, incompatibilities, aging software and side effects from legacy code, reliability is falling. (During the writing of this document the word processor crashed seven times, and that is normal by today's standards.) Worse than this, our computers cannot scale to achieve the level of computation that we desire. As computational analysis, modelling and data mining for the biosciences become increasingly important (Fogel and Corne, 2003), we are finding that even our supercomputers struggle to model or analyse behaviour – Moore's Law cannot keep up with the data produced by advances in DNA synthesis and sequencing productivity (Carlson, 2003). As our ambitions to exploit biological processes in "intelligent" algorithms grow, we discover that the processing power required to evolve and develop solutions, or to create immunity in a network, is prohibitive using