All kinds of forecasts and prognoses have become an important part of our life. Above all this concerns the safety of such complex systems as nuclear power stations, modern aircraft and ships, and so forth. Early warning about natural calamities, be it earthquakes or twisters, is likewise of much importance. The latest achievements of nonlinear dynamics and of the theories of risk control and self-organized criticality have given us a handy tool for the development of the science of prognostication. On the other hand, we have obtained added proof of the temporal limitations of prognosis as well. This range of problems is studied at the M. V. Keldysh Institute of Applied Mathematics (RAS). Two of its leading researchers, Deputy Director Georgi Malinetsky, Dr. Sc. (Phys. & Math.), and Sergei Kurdyumov, Corresponding Member of the Russian Academy of Sciences (RAS), have this to say on the subject.
Up until the 1960s only two classes of processes were believed to exist. The first comprised dynamic systems whose future condition was rigorously determined by their past, i.e. was quite predictable. Once you have a sufficient body of data on their past, you can look far into their future too, as argued by the French astronomer, mathematician and physicist Pierre Laplace (1749-1827), honorary member of the St. Petersburg Academy of Sciences. Or, in modern parlance: once you have good computers and an adequate database, you can predict anything... As to the other class of processes, their future did not depend on their past: they were purely random.
Only later, in the 1970s, did we realize there was yet another, third class of processes, described in formal terms by dynamic systems: a very important class of processes that can be predicted, but only a relatively short time ahead; beyond that horizon, only statistics remains. A simple pendulum demonstrating dynamic randomness is a graphic example. But first, let's look into its design.
Two rigidly interconnected rods are pivoted at a definite distance from the base, where a coil and cells generating an electromagnetic field are fixed. The rods are perpendicular to each other. The vertical rod (in our case, the pendulum) has a massive metal ball at its bottom end and a smaller one at the top. The horizontal rod, fixed in the middle, has a hinge at each end; beams with two small balls rotate freely on these hinges. If an electromagnetic impulse is applied, the vertical rod will start oscillating, with a 95 percent probability of these oscillations being nonperiodic. But the position of the little balls mounted on the hinges is a big question. A simple linear model can be developed for this particular pendular design, one that will enable us to predict the location of the little balls five oscillations ahead. Using the available mathematical models, we can make reliable predictions for twenty oscillations of the big ball; but beyond that the position of the hinges will be unpredictable all the same.
Way back in 1963 the American physicist and Nobel prize winner Richard Feynman said we are essentially limited in our ability to make forecasts even in a world perfectly described by classical mechanics, and this is true of very simple objects, like our pendulum, too. The same year another American scientist, Edward Lorenz (subsequently elected a foreign member of the USSR Academy of Sciences), came to the conclusion that the sensitivity of various systems to their initial state induces randomness. And he asked the legitimate question: why had modern computers, mathematical models and computational algorithms not produced methods of obtaining reliable weather forecasts within a medium-range time framework? He proposed a very simple model of air convection, a phenomenon that plays an important role in atmospheric dynamics. A computer study produced one essential result: dynamic randomness, that is, nonperiodic motion in deterministic systems (the Lorenz system in our case), systems whose future is a function of the past, imposes a limit on prognostication. For weather forecasting this limit is not above three weeks, and for the World Ocean, about a month.
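Lorenz's finding is easy to reproduce numerically. Below is a minimal sketch in Python, assuming the standard textbook parameters of the Lorenz system (sigma = 10, r = 28, b = 8/3); the step size, the time horizon and the one-part-in-a-million perturbation are illustrative choices, not figures from the article.

```python
# Two Lorenz trajectories that start almost identically soon diverge,
# illustrating the finite prognostication horizon of deterministic chaos.

def lorenz_step(state, dt=0.01, sigma=10.0, r=28.0, b=8.0/3.0):
    """Advance the Lorenz system one step with 4th-order Runge-Kutta."""
    def f(s):
        x, y, z = s
        return (sigma * (y - x), x * (r - z) - y, x * y - b * z)
    k1 = f(state)
    k2 = f(tuple(s + dt / 2 * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + dt / 2 * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (p + 2 * q + 2 * u + v)
                 for s, p, q, u, v in zip(state, k1, k2, k3, k4))

def separation(s1, s2):
    """Euclidean distance between two states."""
    return sum((u - v) ** 2 for u, v in zip(s1, s2)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)        # differs by one part in a million
initial = separation(a, b)
for _ in range(2000):              # 20 time units at dt = 0.01
    a, b = lorenz_step(a), lorenz_step(b)
print(separation(a, b) / initial)  # grows by many orders of magnitude
```

After roughly twenty time units the initially indistinguishable trajectories differ by a factor comparable to the size of the attractor itself, which is the kind of prediction horizon the authors describe.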
In fact, every new basic theory opens up new possibilities and shatters old illusions in the same breath. Thus classical mechanics killed the hope of building a perpetual motion machine (perpetuum mobile) of the first kind; thermodynamics showed a perpetual motion machine of the second kind was not possible either. Quantum mechanics shows one cannot precisely measure a microparticle's coordinate and momentum simultaneously. The theory of relativity has shown there are constraints on the rate of signal transmission as well. Nonlinear dynamics, too, has dashed certain illusions. It tells us: there can be no global predictability even for simple systems (like our pendulum). Yet this discipline has made it possible to see not only the essential difficulties involved in prognostication but also its new opportunities.

Now from the angle of mathematics: any dynamic system, no matter what it models, describes the motion of a point in phase space.* The latter is characterized by its dimensionality, that is to say, the number of quantities which must be pre-assigned to determine the state of a particular system. It does not matter for computer processing what these quantities stand for: say, the number of wolves and rabbits in a territory; or the percentage of electors voting for this or that candidate; or variables describing solar activity or a cardiogram. By virtue of this, generalized information models can be constructed. Such models, coupled with the latest achievements of scientific prognostication, have made it possible to attack the problem of rare events, disastrous ones in particular: the problem of their description and prognosis.
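The wolves-and-rabbits example can be made concrete: the state of the system is a single point (rabbits, wolves) moving through a two-dimensional phase space. The sketch below assumes a Lotka-Volterra predator-prey model whose coefficients are purely illustrative, not taken from the article.

```python
# The state of the wolves-and-rabbits system is one point (rabbits, wolves)
# in a two-dimensional phase space; iterating the model moves that point.
# Model choice and coefficients are illustrative assumptions.
def step(rabbits, wolves, dt=0.001,
         birth=1.0, predation=0.1, efficiency=0.02, death=0.5):
    """One Euler step of an illustrative Lotka-Volterra system."""
    dr = birth * rabbits - predation * rabbits * wolves
    dw = efficiency * rabbits * wolves - death * wolves
    return rabbits + dt * dr, wolves + dt * dw

state = (40.0, 9.0)                 # a single point in phase space
trajectory = [state]
for _ in range(20_000):             # 20 time units
    state = step(*state)
    trajectory.append(state)

# Populations oscillate: the phase-space point traces an orbit around
# the equilibrium (death/efficiency, birth/predation) = (25, 10).
print(min(r for r, w in trajectory), max(r for r, w in trajectory))
```

Whatever the two quantities stand for, the computer sees only a moving point, which is exactly why such generalized models can be reused across domains.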
Here it is worth looking at plots describing how the characteristics of two complex, hierarchically organized systems, the stock market and a tectonic fault, change shortly before disaster. Remarkably, these curves are described by one and the same mathematical equation, which we cannot solve as yet. This is still another proof of the claim of nonlinear dynamics about the universality of the scenarios by which randomness emerges from an orderly state.
Nonlinear dynamics and risk control are deeply interconnected. This is now seen in the paradoxical statistics of global disasters. The most diverse catastrophes have been studied from this angle: the tragedies of the Titanic, of the US space shuttle Challenger, and of the Chernobyl nuclear power station,* among many others. Each of these 20th-century dramas is connected with a long chain of cause-and-effect links, or with an unfavorable confluence of rare random factors, as stated in post-mortem studies. Now what is the mathematical view of such rough patches of "bad luck"?
All the way back in the early 19th century a German mathematician and honorary member of the St. Petersburg Academy of Sciences, Carl Gauss, found that a sum of independent, identically distributed random values obeys a definite law. Accordingly, major deviations of such values are so rare that they can be neglected. This rule, known as the Gaussian (normal) distribution, now underlies many engineering computations and technical standards.
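This concentration of sums around the mean is easy to verify numerically. A minimal sketch, with uniform summands and sample counts chosen purely for illustration:

```python
import random

random.seed(0)

# Each sample is a sum of 100 independent uniform values; such sums
# cluster tightly around their mean, and large deviations are rare.
n_terms, n_samples = 100, 10_000
mean = n_terms * 0.5
std = (n_terms / 12) ** 0.5      # stdev of a sum of uniforms on [0, 1]

sums = [sum(random.random() for _ in range(n_terms))
        for _ in range(n_samples)]

within_3_sigma = sum(abs(s - mean) <= 3 * std for s in sums) / n_samples
print(within_3_sigma)  # close to 0.997: deviations beyond 3 sigma are rare
```

This is the regime in which neglecting major deviations, as engineering standards traditionally do, is justified.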
But there is also another class of laws, that of power laws with their heavy tails. In this case one cannot disregard significant deviations: recall the statistics of earthquakes, floods, hurricanes, nuclear accidents, stock market crashes, damages caused by information leaks and the like.
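The contrast with heavy tails can be shown by sampling. In the sketch below, the Pareto exponent of 1.5 and the threshold of ten times the typical value are illustrative assumptions:

```python
import random

random.seed(1)

# Draw from a Pareto (power-law) distribution and from a Gaussian,
# then count "extreme" events, here defined as exceeding 10.
n = 100_000
pareto = [random.paretovariate(1.5) for _ in range(n)]   # heavy tail
normal = [abs(random.gauss(0, 1)) for _ in range(n)]     # light tail

extreme_pareto = sum(x > 10 for x in pareto)   # thousands of events
extreme_normal = sum(x > 10 for x in normal)   # a 10-sigma event: none
print(extreme_pareto, extreme_normal)
```

Under the normal law a ten-sigma deviation essentially never happens; under a power law, events of that magnitude are a routine fraction of the sample, which is why they cannot be neglected.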
When taking up this or that technical project, we consider several options. Say, back in the 15th century Christopher Columbus was one who organized his sea expeditions with meticulous care.* He and his crews weighed all the pros and cons, struck a balance of possible gains and losses, and only then made a decision. This rule of thumb was employed up until the 1950s for assessing many technical initiatives.
But that was not enough, what with the growing complexity of various objects. That approach could not anticipate all sorts of contingencies, even rare and hypothetical ones, which are quite possible nonetheless. Some of these emergencies are handled by R&D organizations, the Ministry for Emergency Situations, and other bodies. As to hypothetical accidents, until recently it was thought they could be neglected in keeping with the Gaussian distribution law. For instance, the likelihood of an accident at an atomic power station is estimated at one in ten million years; that is, only one accident is expected within this span. Well and good, but in this case we are dealing with power-law statistics and must be prepared for the worst anyway.
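Even before heavy tails enter the picture, the plain arithmetic of rare events is instructive. The sketch below assumes accidents in different reactor-years are independent (a Poisson approximation); the fleet size and operating horizon are illustrative assumptions, not data from the article.

```python
import math

# A "one accident in ten million reactor-years" rate is not negligible
# once many reactors run for many years. Fleet size and horizon below
# are illustrative assumptions.
rate = 1 / 10_000_000           # accidents per reactor-year
reactors, years = 400, 50
exposure = reactors * years     # total reactor-years of operation

# Probability of at least one accident under the Poisson approximation.
p_at_least_one = 1 - math.exp(-rate * exposure)
print(p_at_least_one)           # about 0.002: small, but not "never"
```

And if the true statistics are power-law rather than Gaussian, the rate itself understates the danger of the largest events, which is the authors' point.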
Risk control involves a variety of factors. One should know the danger and understand its mechanisms and, what is likewise important, one should be able to identify its harbingers as well. Now take for example the striking phenomenon of rigid (hard) turbulence, discovered in plasma physics in the 1970s and then in other systems. Suppose we have some physical quantity (say, temperature) that varies over a small range in a random mode but makes giant leaps now and then. In such model problems, which can be extrapolated to many real systems, one can detect harbingers, the early warning signals: nothing out of the way has occurred yet, no disaster is yet in sight; but a slowly changing variable tells us we have moved into a danger zone.
The risk control theory and its practical applications are incorporated in the Russian Federation's goal-oriented program to preclude emergency situations in the technogenic sphere and alleviate their consequences should they occur. The accent is on the prognosis of possible calamities and disasters: economically, eliminating the consequences of such catastrophes, which often take a toll in human life, costs tens or hundreds of times as much. In this connection we should mention Academician Nikita Moiseyev's study in which the aftermath of a global nuclear conflict was calculated in all its gruesome statistics.

* Phase space: in classical mechanics and statistical physics, a multidimensional space on whose axes one plots the values of the generalized coordinates and momenta of all particles of a system. - Ed.

* See: N. Fudin, O. Tarakanov, "Chernobyl: Radiation, Stress, Rehabilitation", Science in Russia, No. 5, 1994. - Ed.

* For more detail, see a series of articles in Science in Russia, No. 3, 1992. - Ed.
We have broached the subject of power laws. But what are they in particular? The answer lies in the complexity paradigm and in the associated theory of self-organized criticality. Power-law dependencies are characteristic of many complex systems, say, tectonic faults or securities markets. These most different phenomena have one thing in common: chains of cause and effect. One event entails another, a third, and so forth; this sequence of events affects the entire system. For instance, a mutation ultimately changes a biological species and its ecological niche; in turn, there follow changes in the habitation medium of other species as well, and they have to adjust to new conditions. A transition to a new state of equilibrium as a result of such an "avalanche of changes" may be a long time off.
A heap of sand is the simplest physical model of this kind. If you throw a grain of sand on top, it will either remain there or else roll down and precipitate an avalanche of many other grains. It all depends. The sizes of such avalanches obey power-law statistics; the system, in scientific lingo, hovers on the boundary between deterministic and probabilistic behavior. Or, as we now say, on the brink of randomness.
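The sandpile picture can be simulated directly. Below is a toy version of the Bak-Tang-Wiesenfeld sandpile model, the standard mathematical idealization of this heap; the grid size, the toppling threshold of four grains and the number of drops are illustrative choices.

```python
import random

random.seed(2)

# Drop grains on a grid; any cell holding 4 or more grains topples,
# shedding one grain to each neighbour (grains falling off the edge are
# lost). One dropped grain can trigger an avalanche of almost any size.
N = 20
grid = [[0] * N for _ in range(N)]

def drop(i, j):
    """Add a grain at (i, j); return how many topplings it caused."""
    grid[i][j] += 1
    topplings = 0
    unstable = [(i, j)] if grid[i][j] >= 4 else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4
        topplings += 1
        if grid[x][y] >= 4:          # still overloaded: topple again later
            unstable.append((x, y))
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < N and 0 <= ny < N:
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return topplings

sizes = [drop(random.randrange(N), random.randrange(N))
         for _ in range(20_000)]
print(max(sizes))   # avalanches range from zero topplings to very large
```

Once the pile reaches its critical state, most drops cause little or nothing, while a few trigger avalanches spanning much of the grid: the power-law statistics the text describes.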
A study of complex systems prone to self-organized criticality shows that they tend of themselves toward a critical state in which avalanches of any scope are possible. Belonging to such systems, mind you, are the biosphere, human society, infrastructures of various types, the military-industrial complex and many others. Therefore the theory of self-organized criticality is of much importance in terms of risk control methods and adequate safeguards. For that matter, many research centers worldwide are focusing on the complexity paradigm and its spin-offs. In our country this problem is investigated by the M. V. Keldysh Institute of Applied Mathematics (Russian Academy of Sciences).
G. Malinetsky, S. Kurdyumov, Nelineinaya dinamika i problemy prognoza (Nonlinear Dynamics and Problems of Prognostication), Vestnik RAN, Vol. 71, No. 3, 2001
Prepared by Arkady MALTSEV