The dynamics of a system may be linear or nonlinear. Linear dynamics involves linear mathematical operators, which have the property that their action on a sum of two functions equals the sum of their actions on each function separately.
Nonlinear dynamics means that the output is not linearly proportional to the input. Consider a system described by the equation y(t) = c x(t)². If you double the input x, the output y does not double; it quadruples. But if y(t) = c x(t), we are dealing with linear dynamics. Linearity respects the principle of superposition: if x1(t) and x2(t) are two solutions of an equation, then c1x1(t) + c2x2(t) is also a solution, where c1 and c2 are arbitrary constants. By contrast, nonlinear operators do not obey superposition; e.g., (x1(t) + x2(t))² ≠ x1(t)² + x2(t)².
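The difference is easy to verify numerically. In the sketch below, the constant c = 3 and the inputs are arbitrary illustrative values, not taken from any particular system:

```python
def linear(x, c=3.0):
    return c * x           # y = c*x : a linear operator

def nonlinear(x, c=3.0):
    return c * x**2        # y = c*x^2 : a nonlinear operator

x1, x2 = 1.5, 2.5

# Doubling the input doubles a linear output but quadruples a quadratic one:
print(linear(2 * x1) / linear(x1))         # 2.0
print(nonlinear(2 * x1) / nonlinear(x1))   # 4.0

# Superposition holds for the linear map...
print(linear(x1 + x2) == linear(x1) + linear(x2))           # True
# ...but fails for the nonlinear one: (x1 + x2)^2 != x1^2 + x2^2
print(nonlinear(x1 + x2), nonlinear(x1) + nonlinear(x2))    # 48.0 25.5
```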
In science, chaos is a technical term, signifying highly nonlinear dynamics, although not all nonlinear systems are chaotic. Apart from nonlinearity (or rather, because of strong nonlinearity), a chaotic system is characterized by unpredictable evolution in space and time, in spite of the fact that the differential equations or difference equations describing it are deterministic (if we can neglect 'noise').
Chaotic phenomena are everywhere. According to Kaneko and Tsuda (2000): 'In fact, chaos exists everywhere in our lives. We are tempted to imagine that if chaos is such an ordinary phenomenon, perhaps humans discovered chaos and defined the concept in ancient times. Actually, in many mythical stories chaos is described as a state in which heaven and earth are not divided (a state in which everything is mixed). Chaos is also described as an "energy body" responsible for the creation of heaven and earth'.
In ancient Indian philosophy, the concept of Brahman was propounded for this 'energy body', and the 'state in which everything is mixed' is referred to as Pralaya. Chinese or Indian, these are just interesting old ideas, and have nothing to do with the science of chaos.
In the language of algorithmic information theory (cf. Part 23), chaos has the largest (but not infinite) degree of complexity. By contrast, random or noisy systems have an infinite degree of complexity by this definition.
Chaos theory is about finding the underlying order in apparently random data. It is about a certain class of nonlinear dynamical systems which may be either 'conservative' or 'dissipative'. Conservative systems do not lose energy over time.
A dissipative system, by contrast, loses energy; e.g. via friction. As a consequence of this, it always approaches some limiting or asymptotic configuration, namely an attractor in phase space.
The unpredictability of chaotic systems stems from their extreme sensitivity to initial conditions. This is popularly referred to as the butterfly effect:
The flapping of a single butterfly's wing today produces a tiny change in the state of the atmosphere. Over a period of time, what the atmosphere actually does diverges from what it would have done. So, in a month's time, a tornado that would have devastated the Indonesian coast doesn't happen. Or maybe one that wasn't going to happen, does (Ian Stewart).
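The same effect can be seen in the simplest chaotic system, the logistic map x → rx(1 − x) at r = 4 (a standard textbook example, not one discussed above). The starting point 0.3 and the 10⁻⁶ 'butterfly flap' below are arbitrary illustrative choices:

```python
def trajectory(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r*x*(1-x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.3)
b = trajectory(0.3 + 1e-6)   # a tiny perturbation of the initial condition
gaps = [abs(x - y) for x, y in zip(a, b)]

# The gap between the two trajectories starts microscopic and grows
# by several orders of magnitude within a few dozen iterations:
print(f"initial gap: {gaps[0]:.1e}")
print(f"largest gap: {max(gaps):.3f}")
```

The gap roughly doubles at each step (the map's Lyapunov exponent is ln 2), so after a few dozen iterations the two futures bear no resemblance to each other.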
Poincaré, near the end of the 19th century, had recognized the existence of chaos. The mathematics of celestial motion yielded immensely complicated solutions when the number of interacting bodies was as small as three. But his work on chaos went unnoticed, as did the work of several other scientists in the early 20th century. It was the meteorologist Edward Lorenz who, in 1961, established the field of chaos theory as we know it today. Even his work went unnoticed for nearly a decade.
Lorenz was working on the problem of weather prediction. He started with the standard equations of fluid dynamics and simplified them greatly for his computer-simulation studies of the dynamics of the atmosphere. The remarkable discovery he made, by accident, was that the predictions his model made depended in a crucial way on the precision with which he specified the starting values of its variables. Two calculations, identical except that one of these initial values differed by, say, 0.000001, made totally different long-term predictions. The dynamics was not just nonlinear; even the slightest variations in the initial conditions gave wildly different results after a certain number of time steps.
Further investigations led to the discovery of what is now known as the Lorenz attractor. It was found that, although the results of the calculations were very sensitive to the initial conditions, in every case the output always stayed on a double spiral in phase space.
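A rough sketch of Lorenz's experiment can be written in a few lines, assuming the classic parameter values σ = 10, ρ = 28, β = 8/3; the crude Euler integration, step size, run length, and starting points below are my own illustrative choices, not Lorenz's:

```python
def lorenz_step(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.01):
    """One Euler step of the Lorenz equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x0, steps=5000):
    """Integrate from (x0, 1, 1) and return the final state."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

p = run(1.0)
q = run(1.0 + 1e-6)   # Lorenz's accidental discovery: a tiny initial nudge
print(p)
print(q)
```

Even this crude integration should reproduce Lorenz's observation: the two end states differ wildly, yet both trajectories remain confined to the same bounded region of phase space — the attractor.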
This was new science. Until then only two types of dynamical systems were known: those in steady state, and those in which the system undergoes periodic motion. But a chaotic system does not settle to a single-point attractor or a closed-loop attractor in phase space, although it is not random dynamics either. There is order, except that the phase-space trajectory never comes to the same point again. First one spiral is traced, then the other, then the first again (by a different path); and so on.
The Lorenz attractor belongs to a new family, called strange attractors. Such attractors have a 'self-similarity' feature: they have 'fractal' structure, meaning that the dimension of the structure is a fraction rather than an integer. This fractional dimension is why such attractors are called 'strange'.
I shall discuss fractals in the next post.
Dennis the Menace
Dennis: I want a job where I don't have to be right all the time.
Friend: You want to be a weatherman?