In Part One I outline the topics in Think Complexity and contrast a classical physical model of planetary orbits with an example from complexity science: Schelling's model of racial segregation.
In Part Two I outline some of the ways complexity differs from classical science. In Part Three, I describe differences in the ways complex models are used, and their effects in engineering and (of all things) epistemology.
Part Four pulls together discussions from two chapters: the Watts-Strogatz model of small world graphs, and the Barabasi-Albert model of scale free networks. And now, Part Five: Self-organized criticality.
In 1987 Bak, Tang and Wiesenfeld published a paper in Physical Review Letters, ``Self-organized criticality: an explanation of 1/f noise.'' You can download it from http://prl.aps.org/abstract/PRL/v59/i4/p381_1.
The title takes some explaining. A system is ``critical'' if it is in transition between two phases; for example, water at its freezing point is a critical system. A variety of critical systems demonstrate common behaviors:
- Long-tailed distributions of some physical quantities: for example, in freezing water the distribution of crystal sizes is characterized by a power law.
- Fractal geometries: freezing water tends to form fractal patterns---the canonical example is a snowflake. Fractals are characterized by self-similarity; that is, parts of the pattern resemble scaled copies of the whole.
- Variations in time that exhibit pink noise: what we call ``noise'' is a time series with many frequency components. In ``white'' noise, all of the components have equal power. In ``pink'' noise, low-frequency components have more power than high-frequency components. Specifically, the power at frequency f is proportional to 1/f. Visible light with this power spectrum looks pink, hence the name.
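The claim that power is proportional to 1/f is easy to check numerically. The sketch below is my illustration, not code from the paper: it synthesizes a 1/f signal in the frequency domain with NumPy (amplitude proportional to 1/sqrt(f), so power is proportional to 1/f, with random phases), then fits a line to the log-log power spectrum; for pink noise the slope should be close to -1.

```python
import numpy as np

def pink_noise(n, rng):
    """Synthesize a 1/f signal: amplitude ~ 1/sqrt(f), random phases."""
    freqs = np.fft.rfftfreq(n)            # frequencies in cycles per sample
    amps = np.zeros_like(freqs)
    amps[1:] = 1 / np.sqrt(freqs[1:])     # power ~ amps**2 ~ 1/f (skip DC)
    phases = rng.uniform(0, 2 * np.pi, len(freqs))
    return np.fft.irfft(amps * np.exp(1j * phases), n)

rng = np.random.default_rng(17)
ys = pink_noise(2**16, rng)

# Estimate the power spectrum and fit a line on log-log axes;
# for 1/f noise the slope should be close to -1.
spectrum = np.abs(np.fft.rfft(ys))**2
fs = np.fft.rfftfreq(len(ys))
slope, _ = np.polyfit(np.log(fs[1:-1]), np.log(spectrum[1:-1]), 1)
print(slope)
```

For white noise the same fit yields a slope near 0, which is one way to distinguish the two spectra in practice.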
Critical systems are usually unstable. For example, to keep water in a partially frozen state requires active control of the temperature. If the system is near the critical temperature, a small deviation tends to move the system into one phase or the other.
Many natural systems exhibit characteristic behaviors of criticality, but if critical points are unstable, they should not be common in nature. This is the puzzle Bak, Tang and Wiesenfeld address. Their solution is called self-organized criticality (SOC), where ``self-organized'' means that from any initial condition, the system tends to move toward a critical state, and stay there, without external control.
As an example, they propose a model of a sand pile. The model is not realistic, but it has become the standard example of self-organized criticality.
The model is a 2-D cellular automaton where the state of each cell represents the slope of a part of a sand pile. During each time step, each cell is checked to see whether it exceeds some critical value. If so, an ``avalanche'' occurs that transfers sand to neighboring cells; specifically, the cell's slope is decreased by 4, and each of the 4 neighbors is increased by 1. At the perimeter of the grid, all cells are kept at zero slope, so (in some sense) the excess spills over the edge.
Bak et al. let the system run until it is stable, then observe the effect of small perturbations; they choose a cell at random, increment its value by 1, and evolve the system, again, until it stabilizes.
For each perturbation, they measure the total number of cells that are affected by the resulting avalanche. Most of the time it is small, usually 1. But occasionally a large avalanche affects a substantial fraction of the grid. The distribution of avalanche sizes turns out to be long-tailed, which supports the claim that the system is in a critical state.
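The experiment is simple enough to sketch in a few lines of NumPy. This is my minimal version, not the book's implementation, and the grid size, number of perturbations, and random seed are arbitrary choices: topple evolves the grid until every cell is below the critical value 4, with grains that cross the perimeter lost, and the loop measures each avalanche as the number of distinct cells that topple.

```python
import numpy as np

def topple(grid):
    """Evolve the grid until stable; return how many distinct cells toppled."""
    toppled = np.zeros(grid.shape, dtype=bool)
    while True:
        unstable = grid >= 4
        if not unstable.any():
            return toppled.sum()
        toppled |= unstable
        grid[unstable] -= 4
        # each toppling cell sends one grain to each of its 4 neighbors;
        # grains that would cross the perimeter are simply lost
        grid[:-1, :] += unstable[1:, :]
        grid[1:, :] += unstable[:-1, :]
        grid[:, :-1] += unstable[:, 1:]
        grid[:, 1:] += unstable[:, :-1]

rng = np.random.default_rng(1)
n = 20
grid = np.full((n, n), 10)     # arbitrary initial condition
topple(grid)                   # let the system organize into a stable state

sizes = []
for _ in range(2000):
    i, j = rng.integers(0, n, 2)
    grid[i, j] += 1            # perturb one cell at random
    sizes.append(topple(grid))
```

Plotting a histogram of sizes on log-log axes shows the long tail: most perturbations topple few or no cells, but a few avalanches sweep across much of the grid.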
[Think Complexity presents the details of this model and tests for long-tailed distributions, fractal geometry, and 1/f noise. For this excerpt, I'll skip to the discussion at the end of the chapter.]
Reductionism and Holism
The original paper by Bak, Tang and Wiesenfeld is one of the most frequently cited papers of the last few decades. Many new systems have been shown to be self-organized critical, and the sand-pile model, in particular, has been studied in detail.
As it turns out, the sand-pile model is not a very good model of a sand pile. Sand is dense and not very sticky, so momentum has a non-negligible effect on the behavior of avalanches. As a result, there are fewer very large and very small avalanches than the model predicts, and the distribution is not long tailed.
Bak has suggested that this observation misses the point. The sand pile model is not meant to be a realistic model of a sand pile; it is meant to be a simple example of a broad category of models.
To understand this point, it is useful to think about two kinds of models, reductionist and holistic. A reductionist model describes a system by describing its parts and their interactions. When a reductionist model is used as an explanation, it depends on an analogy between the components of the model and the components of the system.
For example, to explain why the ideal gas law holds, we can model the molecules that make up a gas with point masses, and model their interactions as elastic collisions. If you simulate or analyze this model, you find that it obeys the ideal gas law. This model is satisfactory to the degree that molecules in a gas behave like molecules in the model. The analogy is between the parts of the system and the parts of the model.
Holistic models are more focused on similarities between systems and less interested in analogous parts. A holistic approach to modeling often consists of two steps, not necessarily in this order:
1. Identify a kind of behavior that appears in a variety of systems.
2. Find the simplest model that demonstrates that behavior.
For example, in The Selfish Gene, Richard Dawkins suggests that genetic evolution is just one example of an evolutionary system. He identifies the essential elements of the category---discrete replicators, variability and differential reproduction---and proposes that any system that has these elements displays similar behavior, including complexity without design. As another example of an evolutionary system, he proposes memes, which are thoughts or behaviors that are ``replicated'' by transmission from person to person. As memes compete for the resource of human attention, they evolve in ways that are similar to genetic evolution.
Critics of memetics have pointed out that memes are a poor analogy for genes. Memes differ from genes in many obvious ways. But Dawkins has argued that these differences are beside the point because memes are not supposed to be analogous to genes. Rather, memetics and genetics are examples of the same category---evolutionary systems. The differences between them emphasize the real point, which is that evolution is a general model that applies to many seemingly disparate systems. The logical structure of this argument is shown in a diagram. [The diagram is omitted from this excerpt.]
Bak has made a similar argument that self-organized criticality is a general model for a broad category of systems. According to Wikipedia, ``SOC is typically observed in slowly-driven non-equilibrium systems with extended degrees of freedom and a high level of nonlinearity.''
Many natural systems demonstrate behaviors characteristic of critical systems. Bak's explanation for this prevalence is that these systems are examples of the broad category of self-organized criticality. There are two ways to support this argument. One is to build a realistic model of a particular system and show that the model exhibits SOC. The second is to show that SOC is a feature of many diverse models, and to identify the essential characteristics those models have in common.
The first approach, which I characterize as reductionist, can explain the behavior of a particular system. The second, holistic, approach explains the prevalence of criticality in natural systems. They are different models with different purposes.
For reductionist models, realism is the primary virtue, and simplicity is secondary. For holistic models, it is the other way around.
I am using ``reductionism'' and ``holism'' here in a descriptive sense, not as technical labels for these models. For more general discussion of these terms, see http://en.wikipedia.org/wiki/Reductionism and http://en.wikipedia.org/wiki/Holism.
SOC, causation and prediction
If a stock market index drops by a fraction of a percent in a day, there is no need for an explanation. But if it drops 10%, people want to know why. Pundits on television are willing to offer explanations, but the real answer may be that there is no explanation.
Day-to-day variability in the stock market shows evidence of criticality: the distribution of value changes is long-tailed and the time series exhibits 1/f noise. If the stock market is a self-organized critical system, we should expect occasional large changes as part of the ordinary behavior of the market.
The distribution of earthquake sizes is also long-tailed, and there are simple models of the dynamics of geological faults that might explain this behavior. If these models are right, they imply that large earthquakes are unexceptional; that is, they do not require explanation any more than small earthquakes do.
Similarly, Charles Perrow has suggested that failures in large engineered systems, like nuclear power plants, are like avalanches in the sand pile model. Most failures are small, isolated and harmless, but occasionally a coincidence of bad fortune yields a catastrophe. When big accidents occur, investigators go looking for the cause, but if Perrow's ``normal accident theory'' is correct, there may be no cause.
These conclusions are not comforting. Among other things, they imply that large earthquakes and some kinds of accidents are fundamentally unpredictable. It is impossible to look at the state of a critical system and say whether a large avalanche is ``due.'' If the system is in a critical state, then a large avalanche is always possible. It just depends on the next grain of sand.
In a sand-pile model, what is the cause of a large avalanche? Philosophers sometimes distinguish the proximate cause, which is most immediately responsible, from the ultimate cause, which is, for whatever reason, considered the true cause.
In the sand-pile model, the proximate cause of an avalanche is a grain of sand, but the grain that causes a large avalanche is identical to any other grain, so it offers no special explanation. The ultimate cause of a large avalanche is the structure and dynamics of the system as a whole: large avalanches occur because they are a property of the system.
Many social phenomena, including wars, revolutions, epidemics, inventions and terrorist attacks, are characterized by long-tailed distributions. If the reason for these distributions is that social systems are critical, that suggests that major historical events may be fundamentally unpredictable and unexplainable.
[Think Complexity can be used as a textbook, so it includes exercises and topics for class discussion. Here are some ideas for discussion and further reading.]
1. In a 1996 paper in Nature, Frette et al. report the results of experiments with rice piles (http://www.nature.com/nature/journal/v379/n6560/abs/379049a0.html). They find that some kinds of rice yield evidence of critical behavior, but others do not.
Similarly, Pruessner and Jensen studied large-scale versions of the forest fire model (using an algorithm similar to Newman and Ziff's). In their 2004 paper, ``Efficient algorithm for the forest fire model,'' they present evidence that the system is not critical after all (http://pre.aps.org/abstract/PRE/v70/i6/e066707). How do these results bear on Bak's claim that SOC explains the prevalence of critical phenomena in nature?
2. In The Fractal Geometry of Nature, Benoit Mandelbrot proposes what he calls a ``heretical'' explanation for the prevalence of long-tailed distributions in natural systems (page 344). It may not be, as Bak suggests, that many systems can generate this behavior in isolation. Instead there may be only a few, but there may be interactions between systems that cause the behavior to propagate.
To support this argument, Mandelbrot points out:
- The distribution of observed data is often ``the joint effect of a fixed underlying 'true distribution' and a highly variable 'filter.'''
- Long-tailed distributions are robust to filtering; that is, ``a wide variety of filters leave their asymptotic behavior unchanged.''
What do you think of this argument? Would you characterize it as reductionist or holist?
3. Read about the ``Great Man'' theory of history at http://en.wikipedia.org/wiki/Great_man_theory. What implication does self-organized criticality have for this theory?