Continuing the pattern from my past few posts on Antifragile concepts, today let's consider Naive Intervention: the idea that assuming a simple model actually represents a complex system can lead to disastrously bad decisions.
This subject was covered well by Max Pucher in his three-part series: (1) From Antifragile to Models Behaving Badly, (2) It is a Matter of Survival, not Efficiency, and (3) Illusions of Predictability in Investment Theory and BPM. I will just give a short summary here.
The entire discussion is rooted in complexity theory: some systems are genuinely complex. The overwhelming need to come up with a narrative to explain what is happening in such a system drives people to use models as a way of explaining what is going on.
Emanuel Derman’s book ‘Models Behaving Badly – Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life‘ goes into a lot of detail about the distinction between a scientific law, a theory, and a model — concepts that most of us blur in everyday life. A model is an illusion, and it is not reality. Most of the book explains how financial models are essentially fictions about a complex system, yielding very inaccurate and largely untested predictions. They serve a purpose in allowing us to understand some aspects of the behavior of the financial market, but confusing these illusions with reality can lead to disaster. Models are not theories, like the Dirac theory of the electron, which is able to yield results accurate to more than 10 digits of precision. A model can be helpful in understanding some behavior, but can also be completely incorrect when the situation falls outside certain parameters. No financial model can reliably predict the closing price of AAPL stock one week from today — not even to one digit of accuracy!
The warning is apt in the work management field because human organizations are complex systems, just like financial markets. Many of the models that describe how people work are just as fictional as the financial models. For example, the fundamental assumption behind most economic models is that people will decide to do things (e.g. go to a Broadway show) based on a cost vs. benefit analysis of all the factors. Real life is both much more complex than that, and at the same time often much more basic. We know that success in most important jobs depends upon something we call caring, and yet we have no measure for the amount of caring that a person brings to the job.
In BPM we work with models as well, but success depends upon what you are modeling. If you have a model of how you are going to on-board a new telephone account holder, then you have more or less complete freedom in how to do this. It is a routine, straight-through process, and a model works adequately. But if your model is about human patients, or public opinion, or social work situations, then watch out. In these complex situations models simplify too much and bring about naive intervention.
The problem appears when we make decisions based on those simplified models. There may be a satisfying narrative about why a particular change might improve productivity, but never forget that it is just based on a model. Taleb, in the book Antifragile, points out that some of the best traders knew nothing about the commodities they traded: they knew only the prices and the pattern of exchanges. We like to think that we are not being naive by basing actions on a model, but we do this, as Max points out, because it gives us a feeling of control. We demand that the risks we take be calculable. But what if they are not? What if there is no model that accurately represents the complex system? We abhor that thought. Max says:
As long as we believe we can calculate the risks of complex systems we are continuously increasing the risk of that total collapse.
More than that, we are often led to the simplistic assumption that if something is wrong, it must be a single thing that is wrong:
We need to re-learn that in a complex world the notion of a single logical cause or a predictable outcome of an action is suspect. The constant, random stress is information that aligns the small anti-fragile system with the changes in its environment.
It would be time to stop being so arrogant in pretending that we know it all and have it all under control. We obviously do not!
Naive interventionism is that mode of behavior where the response is dictated by a formula, but that formula is based on a simplistic model. Before introducing such a formula, the organization may seem chaotic and make a lot of random mistakes. After instituting the formula, it avoids many common, predictable mistakes, but it also enshrines a number of less common but still predictable ones. More importantly, adoption of the formula exposes the organization to an even greater risk:
As you make your business more efficient and more stable (by for example using BPM or outsourcing) you are unavoidably reducing its resilience by not being able to react continuously to changing external events.
This is the unintended consequence of formalizing the unformalizable.
Barry Schwartz talks about it as a loss of wisdom. His story is about a child traumatized by being forcibly removed from his parents for two weeks because of a tiny and forgivable mistake the dad made. Every person involved in the enforcement said it was a shame and completely unjustified, yet the laws they were required to follow were clear. Naive interventionism can be seen in this attempt to replace wisdom with a set of formal rules.
We get, once again, to the surprising quality of a complex adaptive system: it needs a certain amount of stressing.
In a complex adaptive system constant stress is not to be mistaken as overreacting to noise but must be understood as environmental tuning information.
We need to re-learn that in a complex world the notion of a single logical cause or a predictable outcome of an action is suspect. The constant, random stress is information that aligns the small anti-fragile system with the changes in its environment.
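The point about noise as tuning information has a classic numerical illustration (not from Max's posts): W. Edwards Deming's "funnel experiment," in which "correcting" a stable process after every random deviation makes the outcomes worse, not better. The sketch below, with hypothetical names, simulates both policies: leaving a noisy process alone versus naively intervening on every deviation.

```python
import random

random.seed(42)
TARGET = 100.0   # the value the process is aimed at
N = 10_000       # number of trials per policy

def run(adjust: bool) -> float:
    """Run the process N times; return the variance of its outputs."""
    aim = TARGET
    outputs = []
    for _ in range(N):
        out = aim + random.gauss(0, 1.0)   # common-cause noise, sigma = 1
        outputs.append(out)
        if adjust:
            aim -= (out - TARGET)          # "correct" every deviation
    mean = sum(outputs) / N
    return sum((o - mean) ** 2 for o in outputs) / N

var_hands_off = run(adjust=False)   # about sigma^2 = 1
var_tampering = run(adjust=True)    # about 2 * sigma^2 = 2
```

Under the tampering policy each output becomes the sum of two independent noise terms, so the variance roughly doubles: the intervention rule, a perfectly logical formula built on a too-simple model of cause and effect, amplifies exactly the randomness it was meant to remove.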
This really should not be surprising. If you want a football team to play well in the finals, you don’t spend all your time sitting around a table drawing pictures and precisely calculating the distances and speeds necessary to win. You don’t save all the energy you can in order to expend it all in a single final game. Instead, you send the team out to play many, many practice games. These practice games are not a waste of energy, but necessary for learning. Team sports are a complex activity. It is the exercise of the team that makes the team learn. By experiencing a wide variety of situations, the players themselves learn how best to situate themselves. Emergency response teams do drills and exercises to increase their effectiveness.
Why should business management be any different? Practice makes the learning organization effective.
The most important thing is to accept that most of the real decisions and behaviors of a large organization are unknown. Instead of insisting that all such decisions be exposed or subsumed by a model, it might be better to let the experts do their jobs. Exercise the organization, measure the total output, but let the individuals figure out how best to contribute. Avoid micromanagement via naive intervention.