Map-makers, Explorers, and Tricksters: New Roles for Planning and Prediction in Nonlinear, Complex Systems
Regions of Nonlinear Amplification: Loss of Information and Unpredictability

Chaos

As is now well appreciated, one of the cornerstones of the complexity revolution concerns nonlinearity. According to the physicist Bruce J. West (1985), the success of linear reasoning formed the backbone of scientific models well into the mid-twentieth century. This linear perspective assumed a one-way, non-reciprocal type of causality, a proportionality between input and output, a negligible environmental influence on a system, and a predictable evolution of the system (as on the flat surface of the plane mentioned above). Discoveries of nonlinearity, however, have radically challenged each of these assumptions. We can see this by looking at one of the most startling types of newly discovered nonlinearity: chaos.
Chaos presents one of the most startling demonstrations of unpredictability in complex systems. Because chaotic systems show a more extensive degree of unpredictability than complex systems generally do, any bounds we can recognize even on the unpredictability of chaotic systems will apply all the more to complex systems in general. The unpredictability of chaotic systems results from their property of sensitive dependence on initial conditions (SIC), which exponentially magnifies small differences or changes in initial conditions. This is the so-called Butterfly Effect, whereby the tiny air currents produced by a butterfly flapping its wings in, say, Sierra Leone can be hugely amplified, leading to a thunderstorm weeks later in Brazil. If an event as tiny as a butterfly flapping its wings can have such a huge impact on a system, and the number of such tiny events happening in a large complex system is so enormous, then precise prediction of the future states of a chaotic system must be impossible. Indeed, mathematical theorems have proven that the unpredictability of a chaotic system will always exceed the capacity of even the fastest computer to predict its future states by calculations based on initial conditions (Ford, 1989).

A way to understand chaos's characteristic of SIC is to first consider what initial conditions are and how they are measured. An initial condition is simply the current state of a system when it is being assessed or measured. Measurements of the initial conditions of the weather, for example, may include air temperature at sea level, air temperature at higher elevations, wind speed, humidity, and so on. Of course, any measurement at some initial point in time will strive to be as precise and accurate as possible. On a graph, this hoped-for, ideal precision of measurement of initial conditions would be captured by a clearly distinct point (see Figure 1 in Appendix A).
But the fact is that every measurement of the initial conditions of any system will contain some degree of imprecision or inaccuracy, because the measurers are fallible, the measuring instruments are fallible, and measurement accuracy will always be limited. For example, measurements of air temperature at sea level will only go as far as some specific decimal place: 75.0093 degrees Fahrenheit. The instrument just cannot go any further. But this means that the measurement, when displayed on a graph, will never be an exact point; it will instead always occupy a region around a point, this region being equivalent to the amount of inaccuracy in the measurement (again, see Figure 1 in Appendix A).
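The amplification of such tiny measurement regions can be made concrete with a minimal sketch, not from the original text, using the logistic map x(n+1) = 4x(n)(1 − x(n)), a standard textbook example of a chaotic system. Two trajectories start from "measurements" that differ only in the tenth decimal place, far below any realistic instrument error, and the gap between them is tracked:

```python
def logistic_map(x):
    """One step of the logistic map with r = 4, a standard chaotic rule."""
    return 4.0 * x * (1.0 - x)

# Two "measurements" of the same initial condition, differing only in
# the tenth decimal place -- far below any realistic instrument error.
x_true = 0.3
x_measured = 0.3 + 1e-10

max_gap = 0.0
for step in range(60):
    x_true = logistic_map(x_true)
    x_measured = logistic_map(x_measured)
    max_gap = max(max_gap, abs(x_true - x_measured))

# The initially negligible discrepancy is amplified until the two
# trajectories bear no resemblance to one another.
print(max_gap)
```

Within a few dozen iterations the two trajectories decorrelate completely: the region of imprecision has been stretched to the full size of the system.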
Unpredictability as the Nonlinear Expansion of Ignorance

Because there can never be a perfectly accurate measurement or assessment of a system's initial condition, there will always be something about the system that, at the time it is measured, remains unknown; in other words, a degree of ignorance or missing information about the system (Ford, 1989). Now the fact that we will always remain ignorant to some degree about a system does not present a problem for predictability in a linear system. The reason is that a linear system does not expand the amount of ignorance we have at any initial condition but simply keeps this ignorance approximately the same. That is, a small amount of ignorance or missing information at the start will merely stay small, because such systems are not sensitive to initial conditions; in other words, they do not amplify the initial imprecision. The linearity of the system guarantees that the amount of what we don't know about the system will remain pretty much the same.

However, in a strongly nonlinear system such as a chaotic one, the ignorance or missing information associated with imprecision of measurement or assessment is "blown up" by the system, and to such a degree that our ignorance of the system will always exceed our ability to predict its future states. Chaotic systems, therefore, are intractably unpredictable, at least as far as future states of the system are concerned (see Figure 2 in Appendix A). In chaotic systems, we become more and more ignorant as we project the current state into the future; that is, our projection of the future will have to be extremely general and imprecise. Consequently, trying to predict the future state of a chaotic system from measurements of its initial condition is largely an exercise in futility. All it can yield is a very large and murky space of possibilities for the system's future states.
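The contrast between linear and strongly nonlinear systems can be sketched numerically. In the fragment below, an illustration with both rules chosen by this editor rather than taken from the text, the same initial measurement error is pushed through a stable linear rule and through the chaotic logistic map:

```python
def linear_step(x):
    # A linear rule: output proportional to input (plus a constant).
    # A small uncertainty in x stays small -- here it actually shrinks.
    return 0.9 * x + 0.05

def chaotic_step(x):
    # The logistic map at r = 4, a standard example of a chaotic rule.
    return 4.0 * x * (1.0 - x)

error = 1e-6  # the same initial measurement imprecision for both systems
lin_a, lin_b = 0.3, 0.3 + error
cha_a, cha_b = 0.3, 0.3 + error

for _ in range(40):
    lin_a, lin_b = linear_step(lin_a), linear_step(lin_b)
    cha_a, cha_b = chaotic_step(cha_a), chaotic_step(cha_b)

print(abs(lin_a - lin_b))  # tiny: the linear forecast stays sharp
print(abs(cha_a - cha_b))  # typically of order 1: the chaotic forecast is useless
```

The linear system keeps our ignorance approximately the same (in this example it even shrinks it), while the chaotic system blows the same ignorance up until it swamps the forecast.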
From the point of view of a planner trying to prognosticate the future, each future state of an organization becomes farther and farther removed from predictions based on the initial conditions. The point is not simply the obvious fact that we can't know everything; it is that chaos exponentially amplifies every small gap in the information at our disposal. In such systems there can be no exact solution, no shortcut that tells us a future state ahead of time: you just have to watch as the system evolves. According to the computer scientist Ed Fredkin: "There is no way of knowing the answer to some question [a nonlinear one] any faster than what's going on ... (even God) cannot know the answer to the question any faster than doing it" (quoted in Wright, 1990, p. 68).
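Fredkin's point can be given a rough number. For a system whose uncertainty roughly doubles each step (the logistic map's average rate, used here purely as an illustrative assumption), the usable prediction horizon grows only logarithmically with measurement precision, so heroic gains in precision buy almost nothing:

```python
import math

# For the logistic map at r = 4, uncertainty roughly doubles each step
# (its Lyapunov exponent is ln 2). We use that rate as an illustration.
DOUBLING_RATE = math.log(2)

def prediction_horizon(measurement_error, tolerance=0.01):
    """Steps until an initial uncertainty of `measurement_error`
    grows past `tolerance`, assuming exponential error growth."""
    return math.log(tolerance / measurement_error) / DOUBLING_RATE

# A millionfold improvement in precision buys only about 20 extra steps.
print(round(prediction_horizon(1e-4), 1))   # ~6.6 steps of useful forecast
print(round(prediction_horizon(1e-10), 1))  # ~26.6 steps
```

Because the horizon depends on the logarithm of the measurement error, each additional step of forecast demands an exponentially better instrument, which is why better measurement alone cannot rescue prediction in a chaotic system.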
Whereas the assumption of linearity in traditional planning presents a picture of system evolution as if it were proceeding on a flat plane, with proportionality between input and output and no surprises ahead, under nonlinear amplification, as in chaos, a small input is magnified into a very large output. Nonlinearity, so to speak, deforms the surface so much that our line of vision is obscured. For a business or institution characterized by some degree of strong nonlinearity, any initial assessment will not be of much help in forecasting future states of the system. The same holds for assessments of the environment: no matter how sophisticated the tools for measuring or assessing environmental variables, if the environment is characterized by strong nonlinearities, the future will remain opaque.

But all this talk about the expansion of our ignorance and the ensuing unpredictability of strongly nonlinear systems is not the whole truth being revealed by complexity research. There are indeed regions of the nonlinear, complex geography that are unpredictable, but the good news is that the more we learn about nonlinear systems, the more we learn about the limits of those regions of unpredictability. Let's turn to some of the ways nonlinear, complex systems are proving to be predictable after all.
Copyright © 2001, Plexus Institute