insight review articles
268 NATURE | VOL 410 | 8 MARCH 2001 | www.nature.com
Networks are on our minds nowadays.
Sometimes we fear their power — and with
good reason. On 10 August 1996, a fault in
two power lines in Oregon led, through a
cascading series of failures, to blackouts in 11
US states and two Canadian provinces, leaving about 7 million customers without power for up to 16 hours^1. The Love Bug worm, the worst computer attack to date, spread over the Internet on 4 May 2000 and inflicted billions of dollars of damage worldwide.
In our lighter moments we play parlour games about connectivity. ‘Six degrees of Marlon Brando’ broke out as a nationwide fad in Germany, as readers of Die Zeit tried to connect a falafel vendor in Berlin with his favourite actor through the shortest possible chain of acquaintances^2. And during the height of the Lewinsky scandal, the New York Times printed a diagram^3 of the famous people within ‘six degrees of Monica’.
Meanwhile scientists have been thinking about networks too. Empirical studies have shed light on the topology of food webs^4,5, electrical power grids, cellular and metabolic networks^6–9, the World-Wide Web^10, the Internet backbone^11, the neural network of the nematode worm Caenorhabditis elegans^12, telephone call graphs^13, coauthorship and citation networks of scientists^14–16, and the quintessential ‘old-boy’ network, the overlapping boards of directors of the largest companies in the United States^17 (Fig. 1). These databases are now easily accessible, courtesy of the Internet. Moreover, the availability of powerful computers has made it feasible to probe their structure; until recently, computations involving million-node networks would have been impossible without specialized facilities.
Why is network anatomy so important to characterize?
Because structure always affects function. For instance, the
topology of social networks affects the spread of informa-
tion and disease, and the topology of the power grid affects
the robustness and stability of power transmission.
From this perspective, the current interest in networks is
part of a broader movement towards research on complex
systems. In the words of E. O. Wilson^18, “The greatest
challenge today, not just in cell biology and ecology but in all
of science, is the accurate and complete description of
complex systems. Scientists have broken down many kinds
of systems. They think they know most of the elements
and forces. The next task is to reassemble them, at least
in mathematical models that capture the key properties of
the entire ensembles.”
But networks are inherently difficult to understand, as
the following list of possible complications illustrates.
1. Structural complexity: the wiring diagram could be an
intricate tangle (Fig. 1).
2. Network evolution: the wiring diagram could change
over time. On the World-Wide Web, pages and links are
created and lost every minute.
3. Connection diversity: the links between nodes could have different weights, directions and signs. Synapses in the nervous system, for example, can be strong or weak, excitatory or inhibitory.
Exploring complex networks
Steven H. Strogatz
Department of Theoretical and Applied Mechanics and Center for Applied Mathematics, 212 Kimball Hall, Cornell University, Ithaca,
New York 14853-1503, USA (e-mail: strogatz@cornell.edu)
The study of networks pervades all of science, from neurobiology to statistical physics. The most basic
issues are structural: how does one characterize the wiring diagram of a food web or the Internet or the
metabolic network of the bacterium Escherichia coli? Are there any unifying principles underlying their
topology? From the perspective of nonlinear dynamics, we would also like to understand how an enormous
network of interacting dynamical systems — be they neurons, power stations or lasers — will behave
collectively, given their individual dynamics and coupling architecture. Researchers are only now beginning
to unravel the structure and dynamics of complex networks.
Dynamical systems can often be modelled by differential equations dx/dt = v(x), where x(t) = (x_1(t), …, x_n(t)) is a vector of state variables, t is time, and v(x) = (v_1(x), …, v_n(x)) is a vector of functions that encode the dynamics. For example, in a chemical reaction, the state variables represent concentrations. The differential equations represent the kinetic rate laws, which usually involve nonlinear functions of the concentrations.
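To make the notation concrete, here is a minimal sketch (not from the article) of encoding dx/dt = v(x) in code and flowing the state forward with the forward Euler method. The example is mass-action kinetics for a hypothetical bimolecular reaction A + B → C; the rate constant k is an assumed value.

```python
# Sketch: dx/dt = v(x) for a hypothetical reaction A + B -> C with
# mass-action kinetics. The state x = (a, b, c) holds concentrations.
k = 1.0  # assumed rate constant

def v(x):
    """Vector field v(x): the kinetic rate laws (nonlinear in x)."""
    a, b, c = x
    rate = k * a * b          # bilinear rate law for A + B -> C
    return (-rate, -rate, rate)

def euler(x, dt, steps):
    """Crude forward-Euler flow of x(t) through state space."""
    for _ in range(steps):
        x = tuple(xi + dt * vi for xi, vi in zip(x, v(x)))
    return x

# integrate from t = 0 to t = 10 with equal initial concentrations
a, b, c = euler((1.0, 1.0, 0.0), dt=0.001, steps=10000)
```

With equal initial concentrations the system has the closed form a(t) = 1/(1 + kt), so the numerical answer can be checked directly; any standard ODE solver would serve equally well here.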
Such nonlinear equations are typically impossible to solve analytically, but one can gain qualitative insight by imagining an abstract n-dimensional state space with axes x_1, …, x_n. As the system evolves, x(t) flows through state space, guided by the ‘velocity’ field dx/dt = v(x) like a speck carried along in a steady, viscous fluid.
Suppose x(t) eventually comes to rest at some point
x*. Then the velocity must be zero there, so we call x* a
fixed point. It corresponds to an equilibrium state of the
physical system being modelled. If all small disturbances
away from x* damp out, x* is called a stable fixed point
— it acts as an attractor for states in its vicinity.
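A one-dimensional sketch makes the attractor idea tangible. The system dx/dt = x(1 − x) is a hypothetical choice (not from the article): it has a fixed point at x* = 1, where the velocity vanishes, and nearby disturbances damp out.

```python
# Sketch: a stable fixed point. For dx/dt = x*(1 - x), the point
# x* = 1 satisfies v(1) = 0, and small disturbances away from it decay.
def v(x):
    return x * (1.0 - x)

def flow(x, dt=0.01, steps=2000):
    """Flow the state forward by crude Euler steps (total time 20)."""
    for _ in range(steps):
        x += dt * v(x)
    return x

assert v(1.0) == 0.0          # the velocity is zero at the fixed point
x_final = flow(1.0 + 0.2)     # disturb the system away from x* = 1
```

The disturbed trajectory relaxes back to x* = 1, which is exactly what it means for the fixed point to act as an attractor for states in its vicinity.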
Another long-term possibility is that x(t) flows
towards a closed loop and eventually circulates around it
forever. Such a loop is called a limit cycle. It represents a
self-sustained oscillation of the physical system.
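A standard concrete example of a limit cycle (chosen here for illustration; the parameter value mu is an assumption) is the van der Pol oscillator, dx/dt = y, dy/dt = mu(1 − x²)y − x. Trajectories starting near the origin spiral outwards onto a closed loop of amplitude close to 2.

```python
# Sketch: a limit cycle in the van der Pol oscillator.
#   dx/dt = y,  dy/dt = mu*(1 - x**2)*y - x
mu = 0.5  # assumed damping parameter

def step(x, y, dt):
    """One forward-Euler step of the van der Pol vector field."""
    dx = y
    dy = mu * (1.0 - x * x) * y - x
    return x + dt * dx, y + dt * dy

x, y = 0.1, 0.0               # start from a small disturbance
for _ in range(40000):        # transient: let x(t) approach the loop
    x, y = step(x, y, 1e-3)

amplitude = 0.0
for _ in range(20000):        # a few laps around the cycle
    x, y = step(x, y, 1e-3)
    amplitude = max(amplitude, abs(x))
```

The self-sustained oscillation is independent of the initial condition: a trajectory started far outside the loop would spiral inwards onto the same cycle.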
A third possibility is that x(t) might settle onto a
strange attractor, a set of states on which it wanders
forever, never stopping or repeating. Such erratic,
aperiodic motion is considered chaotic if two nearby
states flow away from each other exponentially fast.
Long-term prediction is impossible in a real chaotic system because of this exponential amplification of small uncertainties or measurement errors.
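This exponential amplification can be demonstrated numerically. The sketch below (an illustration, not from the article) integrates two copies of the Lorenz system, with its classic parameter values, from initial states that differ by one part in a billion, and measures how far apart they end up.

```python
# Sketch: sensitive dependence on initial conditions in the Lorenz
# system (classic parameters sigma = 10, rho = 28, beta = 8/3).
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def step(s, dt):
    """One forward-Euler step of the Lorenz vector field."""
    x, y, z = s
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

a = (1.0, 1.0, 20.0)
b = (1.0 + 1e-9, 1.0, 20.0)   # differs by one part in a billion
for _ in range(15000):         # 15 time units at dt = 0.001
    a, b = step(a, 0.001), step(b, 0.001)

# Euclidean separation between the two trajectories
sep = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
```

The separation grows by many orders of magnitude while both trajectories stay on the same bounded attractor, which is precisely why long-term prediction fails even though the motion is deterministic.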
Box 1 | Nonlinear dynamics: terminology and concepts