Seven scientists are convicted in Italy for not predicting an earthquake well enough. Weather forecasters are threatened with lawsuits for forecasts that turned out wrong. A strange turn of events, tied to a fundamental popular misunderstanding of complexity.
The human mind adores simple rules. When things are complicated, we like to use analysis to break out separate parts and solve them as separate problems. Thanks to Newton and Descartes, reductionists believe that all problems can be solved this way if you are patient enough to break a large problem into enough pieces. It is reassuring to think that all things are knowable.
In stark contrast to this is the idea that many phenomena are complex. That is, the internal dependencies are such that no part operates independently of the others: you cannot isolate one part of the problem and solve it independently of the other parts. The weather. Earthquakes. An ecosystem. The marketplace. A social network. A business. And many more. They are all around us, but most people are blind to the implications.
In Italy, scientists have been convicted for giving 'false assurances' before an earthquake. They reported that the series of earthquakes experienced was consistent with patterns normally seen, and that a bigger quake was unlikely. But earthquakes are inherently chaotic, and soon afterward an unlikely, record-breaking 6.3 magnitude quake hit the town, built mostly of un-reinforced masonry, and more than 300 people died. A tragedy to be sure, but is this the fault of the geophysicists?
Earthquakes are inherently unpredictable. Chaos theory gives this a name: the butterfly effect, also known as "sensitive dependence upon initial conditions," which refers to the way that very small perturbations can, over time, make a tremendous change in the outcome. Errors in reading the initial conditions do not smooth out and go away as time progresses (as would be expected from an Enlightenment view of the world); instead these errors build up over time. This is not a flaw in the prediction model, but an intrinsic quality of the world. Some might consider it a success of science that so many people believe in the power to predict earthquakes, to such an extent that the inability to predict one would be considered a crime.
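Sensitive dependence on initial conditions is easy to demonstrate. Here is a short, purely illustrative Python sketch (not an earthquake model of any kind) using the logistic map, one of the simplest equations known to behave chaotically. Two starting values that differ by one part in ten billion soon produce completely different trajectories:

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r*x*(1-x) in its chaotic regime (r = 4.0).

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-10)  # perturb by one part in ten billion

# Early on the two trajectories agree; the tiny error then compounds
# roughly exponentially until the trajectories are unrelated.
print(abs(a[5] - b[5]))                       # still microscopic
print(max(abs(x - y) for x, y in zip(a, b)))  # has grown enormously
```

Notice that the error is not measurement noise that averages away; it is amplified by the dynamics themselves, which is exactly the Enlightenment intuition the post describes failing.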
Perhaps the field of geophysics could have been more vocal about the inability to predict anything about earthquakes, but that would be misleading as well. There are certain trends that can be statistically predicted: for example, California will have so many earthquakes averaging a certain magnitude over a period of years. The science can even be helpful in predicting which parts of the land will be more or less affected on average, or how to prepare in general for earthquakes. This is critically important. But at the same time there is no ability to predict a single event with high precision. How can this make sense?
This idea of "unpredictability in the midst of relative stasis" is very hard for anyone to understand. Nassim Nicholas Taleb has written several books, most notably "Fooled by Randomness," in which he takes Wall Street denizens to task for inventing narratives to explain price swings in the stock market, when in fact those price swings have all the hallmarks of chaos. Finding explanations for observed behaviors is a survival skill that the human mind exercises naturally. Concluding that a behavior is "simply randomness" can be deeply disturbing, and most people reject it.
If the public at large is unwilling to accept that certain things are inherently random, what is next: suing the weatherman when he predicts incorrectly? In another story we find that a Belgian town may sue over soggy weather forecasts, because the weather in fact turned out to be nicer than normal in some parts. People simply have an unjustified faith in the ability of scientists to predict the weather.
Human organizations and businesses are also complex (a.k.a. wicked problems). Certain macro trends can be seen in advance (e.g. approximately how many people will buy smartphones this month), and still the details can be entirely unpredictable (e.g. whether a particular customer prefers an iPhone 5 or a Samsung Galaxy). The fact that general trends emerge tends to convince people that with good enough models and powerful enough computers the entire market could be "solved" and laid out as a simple business process. We need to remember that complexity comes hand-in-hand with unpredictability.
One reason people find unpredictability so hard is the assumption that a system can only "make" something simpler than itself. To make a complicated car takes a much more complicated factory. This seems intuitive, but Stephen Wolfram, in his book "A New Kind of Science," shows that incredibly complex patterns can emerge from incredibly simple calculations. I have illustrated this concept with the Julia Set (fractal) decorating this blog: an incredibly complex pattern that results from a fundamentally simple calculation. Complex systems surround us, but without understanding them, we are blind.
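To make that concrete, here is a minimal sketch of the calculation behind a Julia set (the constant c = -1 is my own choice, the well-known "basilica" set, not necessarily the one decorating this blog). The whole intricate boundary comes from nothing more than repeatedly squaring a complex number and adding a constant:

```python
def escape_time(z, c=-1.0 + 0.0j, max_iter=80):
    """Count iterations of z -> z*z + c before |z| exceeds 2.

    Points that never exceed 2 within max_iter steps belong to the
    filled Julia set for this c; everything else escapes to infinity.
    """
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

print(escape_time(complex(2.5, 0.0)))  # 0: already outside, escapes at once
print(escape_time(complex(0.0, 0.0)))  # 80: 0 -> -1 -> 0 -> ... never escapes

# A coarse ASCII rendering of the plane; '#' marks points that never escape.
for row in range(21):
    y = 1.1 - row * 0.11
    print("".join(
        "#" if escape_time(complex(-1.5 + col * 0.05, y)) == 80 else "."
        for col in range(61)))
```

The rule is one line; the boundary between escaping and non-escaping points is infinitely detailed. That asymmetry between the simplicity of the rule and the complexity of the result is the point.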
If you want to know more about complexity, I can recommend an excellent book on the subject: in "Complexity: A Guided Tour" (see my review) Melanie Mitchell starts at the beginning and explains how all complexity results from iteration: weather systems are masses of air molecules repeatedly interacting, ecosystems have large numbers of organisms all interacting and competing, stock traders are constantly trying to outwit each other, and so on. The ability to explore complexity required computing systems. She explains Gödel's Theorem, Turing's Machine, and Wolfram's Rule 110. I can't recommend this book highly enough for anyone wanting to grasp the basics. It is a well-written, non-mathematical treatment of an admittedly esoteric topic.
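Rule 110 itself fits in a few lines of code. The sketch below is my own minimal implementation, not taken from the book: each cell's next state depends only on itself and its two neighbors, via an eight-entry lookup table encoded in the binary digits of the number 110. From this triviality, behavior rich enough to be Turing-complete emerges.

```python
RULE = 110  # the rule number encodes the 8-entry lookup table in binary

def step(cells):
    """Apply Rule 110 once to a row of 0/1 cells (edges wrap around)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right  # neighborhood as 0..7
        out.append((RULE >> index) & 1)              # look up the new state
    return out

# Start from a single live cell and print 20 generations.
row = [0] * 31 + [1] + [0] * 31
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Swapping RULE for a different number gives any of the 256 elementary cellular automata; most are dull, which makes the complexity of 110 all the more striking.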
One can only hope for a better appreciation of unpredictability. As long as there are misunderstandings about complexity, as long as the public believes that everything in the world is ultimately calculable, any scientist working with complex phenomena will run a high risk of being blamed when the unpredictable happens.
Comments from a different thread:
My only comment would be that there is an element of predictability to earthquakes, and the danger was higher at the time of the L’Aquila earthquake due to the ongoing swarm, but only perhaps a factor of 100 or 1000 higher, not nearly enough to warrant serious preventative actions.
In essence, the situation was much less predictable than the people suing think (not completely unpredictable), as you describe (and the actions of the scientists are misinterpreted most uncharitably).
I have to find a way to say that: not completely unpredictable, but only 0.1% predictable? There is always "an element of predictability".
Not just that something is likely or unlikely, or extremely unlikely, but rather that it is mostly beyond prediction. I can't figure out a better way than to say "unpredictable," on the basis that predictable would be 100% (or at least >50%) and unpredictable is anything less than that.
One term we use is "probability gain". We can gain a factor of 10 to 1000 by watching background activity, but usually need to reach 10^5 to 10^7 to be really useful.
Hi Keith, thanks for reiterating the subject of complexity. It can't be said often enough. Unfortunately it is a simple concept that comes close to a theory of everything, yet it can't be made understood and accepted in simple words. Which is why my 2011 post on the subject was even longer. http://isismjpucher.wordpress.com/2011/07/06/the-complexity-of-simplicity/
The widespread error is the belief that something is complex only because we lack information or understanding. This is similar to the misinterpretation of Heisenberg's Uncertainty Principle as saying that the inability to measure accurately is caused by inadvertently influencing the measurement. Yes, we always influence what we measure, but that is not the problem. A quantum physical phenomenon such as a simple photon cannot be located exactly while its momentum is being measured. Which leads to the probability phenomenon you mentioned.
All natural systems are complex in themselves, and not just because we can't understand them. That fallacy is also present in the BIG DATA illusion. Even if we gather all the possible data about a natural complex adaptive system (e.g. a business or the economy), we can't predict its behavior or interactions. We can predict larger statistical outcomes, because the Gaussian distribution is a natural phenomenon too, but we can't predict the future of a single entity within it, or how the system will change over time.
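A small simulation makes this point vivid (a toy sketch with made-up numbers, not a model of any real market): the aggregate rate across a large population of independent agents is tightly predictable, while any single agent's choice is not.

```python
import random

random.seed(1)  # fixed seed only so the sketch is reproducible

def simulate(n_agents=100_000, p=0.3):
    """Each independent agent 'buys' with probability p; return all choices."""
    return [1 if random.random() < p else 0 for _ in range(n_agents)]

choices = simulate()
rate = sum(choices) / len(choices)

# The aggregate is stable: the law of large numbers pins it very near 0.3 ...
print(f"aggregate buying rate: {rate:.3f}")

# ... but no amount of aggregate data tells you what agent #42 will do.
print(f"agent #42 chose: {choices[42]}")
```

The statistical outcome is reliable precisely because the individual outcomes are not: the fluctuations average away in the aggregate while remaining fully present in each member.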
It is the human fear of unpredictability that led to the flow-diagram fallacy. Business processes must be considered in the sense of customer perception (customer experience is now a buzzword), but that is hardly ever considered in BPM. Even in the 'Outside-In' approach, an assumption is made about what the customer should get at the end of the process, rather than truly understanding what he perceives. Moreover, we are not able to control the outcome upfront. Control is only possible to some extent in laboratory or factory environments, and even there rigid process becomes less important while more and more real-time control is applied to deal with unpredictability. Yet in BPM, and that includes the Gartner concept of iBPMS, we continue to enforce the idea of predictability. It is quite ridiculous and ignorant actually!
Thanks Max. Agree of course. I like a diagram not as a model of the world, but as a model of how I want to talk about the world to another person. It is not a model of "what is" but rather a model of the rather limited conception that I use to guide activity. The problem comes when people fail to understand the distinction and take the model to be a representation of reality.
First of all: great post and lesson on unpredictability.
I share the appalled reactions of the scientific and international community to the judges' decision in this case, as it will lead to an impasse in which scientists won't make ANY predictions in the future, even when prediction is sensible (or MORE sensible, in line with the comment you cross-posted).
I just wanted to add one bit of information, which is not much in line with the predictability of physical models, but it may put the whole story in a different light (and still teach something about complex systems).
As you may imagine, a wild debate is now ongoing in Italy on this issue. Beyond the scientific statements themselves, experts and politicians are raising several other questions:
1. what was the role of the scientists?
2. who asked them to speak out? why?
3. were their statements misunderstood, manipulated, or at least used for ulterior purposes?
4. is there any bias in a tribunal procedure with so large visibility and impact on the media?
All this boils down to some points that are not so marginal: first, the actual judicial decision and its motivations are based not on the scientific statements but on COMMUNICATION mistakes; second, the scientists (or at least their statements) may have been "manipulated" by other forces with specific purposes in mind (and this may amount to a single fault on their part: that they allowed themselves to be manipulated).
Bottom line: there is a further level of complexity in the story, related to SOCIAL, POLITICAL, MEDIA, and PSYCHOLOGICAL aspects that should be taken into account.
This makes the whole system totally unpredictable (at least until Isaac Asimov's psychohistory discipline becomes reality, see http://en.wikipedia.org/wiki/Psychohistory_(fictional) :-) ).
I mention all this because I think that exactly the same issues apply in large organizations and international companies too, and this contributes to the difficulty of their governance.
Thanks Marco. It will be interesting to see how this plays out. I realize, of course, that the judges' decision is about whether it was appropriate to assure people that there was no danger. In retrospect, that seems to have been a mistake, but I have it on good authority that, based on reasonable probability estimates, there was no justification for giving any warning. That is, based on the science, if the same sequence of quakes arrived this week, there would still be no justification for warning of an impending large quake.
But what will happen now is that warnings will be issued even when they are not justified. Remember, even if there is only a one-in-a-million chance, it still can happen. Nobody will ever say that you don't need to evacuate, no matter how unlikely an earthquake is. People will get tired of evacuating for one-in-a-million chances, the warnings will be ignored, and people will still get killed.
The real action should be taken in times without earthquakes: today people should be tearing down those buildings that are still standing and replacing them with properly reinforced construction. (I know ... such a shame ... I love the old architecture, but not when the buildings are deathtraps.) Allowing people to live in flimsy buildings is the real crime. But is that being discussed? I suppose it is, but I wonder if it really has the intensity it deserves.
About psychohistory: I was a huge fan of the Foundation trilogy when I was a boy. Now, however, I feel that the idea of psychohistory is exactly the fallacy that fails to understand how chaos works. Asimov can be excused considering the time he wrote it. I wish more people knew those books, because they would be a convenient way to explain the fallacy.