It’s all Newton’s Fault

In working with organizations on their processes, I consistently find that most people are truly surprised to find out how complicated their processes necessarily are.  Turning the question around, I started wondering: “Why is it that we feel processes should be simple?”  This belief is at the center of a key problem people have in process management.  It represents a kind of blindness that keeps some from being successful in process management.  For now, let me blame it all on Isaac Newton.

This is an excerpt from my article “The Quantum Organization: How Social Technology will Displace the Newtonian View,” which will be published this June in “Social BPM: Work, Planning, and Collaboration under the Impact of Social Technology.”  I will be covering this and more in my keynote at the Social Business Forum in Milan on June 8.

Somewhere between the years 1500 and 1700, there emerged a new way of thinking: the Newtonian view of the world.  This should not be credited exclusively to Sir Isaac Newton. It is really a philosophical approach to viewing the world, influenced by a number of scientists and philosophers after the Renaissance, starting possibly with Copernicus, Kepler, Galileo, and Bacon, with a profound embellishment by René Descartes, but in many ways crystallized and best illustrated in Newton’s “Principia.”

The key thing about the Newtonian view is that the world is a machine that is based on simple principles.  Gravity became simply a force that was proportional to the product of the two masses involved divided by the square of the distance between them. A body in motion remains in motion unless acted upon by an external force. This put in place the building blocks that could describe a large variety of natural phenomena.

It is important to remember that before this shift in view, the world was imagined as being filled with spirits and inscrutable invisible actors.  Storms and floods were seen as downright malevolent, caused by a failure to perform the right ritual that was invisibly connected to the occurrence of natural phenomena.   The 1600s changed this, and people began to search for very simple rules that governed the world, such as F = ma.   This worked so well that it seemed everything could be predicted if you could measure the initial conditions carefully enough. There was, in principle, infinite precision possible. With a measuring device fine enough, you could predict the future of world events.

Built into the Newtonian point of view is an aspect of “smoothness”: the idea that a rough approximation of the input conditions would give you a rough approximation of the output answer. Greater detail on the input would give you greater accuracy in the answer, but would never fundamentally change the result.
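This “smoothness” can be sketched numerically. The following is my own minimal illustration, not from the article, using the textbook projectile-range formula R = v²·sin(2θ)/g: a small error in the measured input produces a proportionally small error in the predicted output, and finer measurement always shrinks the error.

```python
import math

def projectile_range(speed, angle_deg, g=9.81):
    """Ideal range of a projectile: R = v^2 * sin(2*theta) / g."""
    theta = math.radians(angle_deg)
    return speed * speed * math.sin(2 * theta) / g

base = projectile_range(20.0, 45.0)           # the "true" initial speed
nudged = projectile_range(20.0 * 1.01, 45.0)  # speed measured 1% too high

# Smoothness: a 1% input error yields only a ~2% output error,
# and a more precise measurement would shrink the error further.
print(round(base, 2))           # 40.77 (meters)
print(round(nudged / base, 4))  # 1.0201
```

The error in the answer is bounded by the error in the measurement, which is exactly why the Newtonian worldview expected arbitrary predictability from arbitrary precision.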

Turbulence disturbs Newtonians. When air flows around an object, it does not snap smoothly back together on the far side; instead it swirls around with extra motion that cannot be explained by the force necessary to go around the barrier. It didn’t fit. Everything was supposed to be smooth, and there was nothing in the Newtonian view of the world that would explain the extra motion. Here was a case where more detailed measurements yielded larger variations.  Turbulence, and similar things that did not fit, were considered places where the rules “broke down” and generally should be avoided.

The Newtonian View is based on these central concepts:

  • externally observable—the essential aspects of the system are all observable and can in principle be known.
  • smoothness—the large approximates the small. A rough estimate will indicate roughly the outcome. Finer, more detailed measurements will allow one to calculate the result to finer precision, but will never yield a different result.
  • basic rules are simple—there are a fixed number of laws which are fundamentally simple, and can be applied easily given accurate initial conditions.
  • predictability—based on the rules and the starting states, you are able to calculate what has to happen to any degree of precision you need. If you are able to set up the initial conditions with sufficient precision, you will get predictability and repeatability to theoretically any level.

These concepts were applied in all fields of endeavor: biology, physiology, sociology, even psychology. The successes were so dramatic that it appeared to many that the world consisted of nothing but machines.

It is with this backdrop that the great organizational thinkers, like Frederick Winslow Taylor, attempted to isolate the basic fundamental principles that allow an organization to work.  Thinking of the organization as a machine, you identify the top-level goals, then the steps to achieve them, breaking the steps into sub-goals and sub-steps.  You define all the action in finer and finer detail, eventually specifying precisely what a single person does in a single situation.

Think of the organization as a very elaborate mechanical device, like a clock.  The clock is filled with gears that are all designed to interact perfectly.  The greater the precision used to make the parts, the better it will keep time, and the longer it will run without maintenance.  The goal for an organizational planner, then, is to construct extremely precise pieces, made of very tough materials, with very exacting measurements, each fitting exactly with the others.

One problem with viewing the organization as a machine: machines wear out.  They perform only the function they were designed for, and are very hard to adapt to another purpose. Organizations built on these principles are brittle.  What to do?

New Science for Organizations

I ran across a marvelous book called “Leadership and the New Science” by Margaret Wheatley, as well as another book she co-wrote with Myron Kellner-Rogers called “A Simpler Way.”  She was intrigued by the parallels between Modern Physics (Quantum Mechanics, Relativity, and all those non-Newtonian ideas) and the ideas that form the basis of mature functioning organizations.

Instead of thinking of a “clock” as the metaphor for an organization, try thinking of an “ecosystem.”  In an ecosystem you have a multiplicity of different flora and fauna.  Plants of all sizes and shapes grow mixed together with others.  Yet there is a particular character to each kind of ecosystem.  Conditions will change: there will be hot, dry years and cold, wet years, and yet the ecosystem can be very stable.  This is a far better view of an organization.  A leader is seen as a kind of gardener who neither designs the plants nor makes them grow, but simply nurtures and trims in order to achieve harmony.

The desire to isolate the rules of an organization to a small set of simple equations, a completely Newtonian idea, leads you down a dead end in terms of planning and running a real organization.

Why would we stay locked in our belief that there is one right way to do something, or one correct interpretation to a situation, when the universe demands diversity and thrives on a plurality of meaning? – Margaret Wheatley

The Quantum Viewpoint

The most disturbing affront to the Newtonian view came from the development of modern physics and quantum mechanics.  The uncertainty principle runs diametrically opposite to the idea that precise outcomes are obtained from precise parts.  Consider the electron: it is literally impossible to say exactly where it is at any given point in time.  Atoms are constantly moving about.  The idea that stretches our intuition the most is that even with all that commotion at the microscopic level, there are very stable structures at the macro scale.   Consider the desk or table in front of you: all of its atoms are constantly in motion and constantly interacting with each other, but the furniture itself is quite stable for a long time.   It is quite a bit like the ecosystem, which has plants growing, dying, seeding, and regrowing again.

Note how the quantum view strikingly contrasts with the Newtonian view:

  • uncertainty principle limits what can be observed—things can only theoretically be externally known to a certain level of precision.
  • turbulence—there is a steady continual flow of interaction, a fundamental graininess to the universe.
  • relationship is the fundamental unit—everything is related to everything else, and the interactions are complex and can’t be abstracted away from each other into simplified rules.
  • unpredictability—sensitive dependence on initial conditions means that small errors in measurement will build up to the point that after a certain period it is impossible, even theoretically, to predict the final state.

The quantum view opens the way to chaos theory, where we look at large systems not as isolated actions that obey simple rules, but as rich, infinitely complex interactions that cannot be reduced to a handful of simple rules.  All of this makes the Newtonian view seem antiquated.
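Sensitive dependence on initial conditions is easy to demonstrate. The sketch below is my own illustration, not from the article; it iterates the logistic map, a standard toy example from chaos theory. Two starting points that differ by one part in a million track each other for a few steps and then diverge completely:

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), a textbook chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.300000)
b = logistic_orbit(0.300001)  # the "same" start, measured slightly wrong

gap = [abs(x - y) for x, y in zip(a, b)]
print(gap[3])         # still tiny: the measurement error has barely grown
print(max(gap[30:]))  # large: long-range prediction has broken down
```

No measurement is ever exact, and here the error compounds roughly exponentially, so after enough steps the final state is unpredictable even though the rule itself is perfectly deterministic.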

Thinking about these possibilities makes my head hurt!  Planning a system on these principles would have been impossible until recent years.  The advent of social technology has opened the door to systems that could be designed along these principles.  I hope I can post some more in the coming days on how key thinkers see social technology affecting organizations, and particularly how organizations can best leverage this capability, but I have used up all the space today.

In summary: we need to be mindful of how Newtonian ideas have led us to search for simple, mechanistic models for our organizations.  A better model, however uncomfortable, might be to consider the organization less as a machine and more as an ecosystem that is constantly in flux, constantly changing, and yet stable in the face of changes in the marketplace.  In a sense, a quantum organization.

At present, our most sophisticated way of acknowledging the world’s complexity is to build elaborate system maps, which are most often influenced by a quest for predictability. When we create a map—displaying what we think are all the relevant elements and interactions—we hope to be able to manipulate the system for the outcomes we desire. We are thinking like good Newtonians. But what we hope for is not possible. There are no routes back to the safe harbor of prediction—no skilled mariners able to determine a precise course across the quantum ocean. The challenge for us is to see past the innumerable fragments to the whole, stepping back far enough to appreciate how things move and change as a coherent entity. – Margaret Wheatley

13 thoughts on “It’s all Newton’s Fault”

  1. Hi Keith, great article that covers unpredictability in complex adaptive systems with a nice comparison to quantum physics. I have done so in the past too (6-7 years ago) to make people think about human processes differently and was ridiculed. Good to see that others are picking up the line of thought. I remember our conversations about the difference between ‘complicated predictable’ and ‘complex adaptive’ in 2009 and how you said: ‘Max, you do have a point, but people won’t understand.’ Glad to read that you try the same now! And you do it with more patience than I do.

    Let me however add a few points here. The reason I stopped using quantum mechanics and rather focused on complex adaptive systems is that QM is principally outside our sphere of real-world observation. But the uncertainty principle is there not because things are moving around too much, nor because we can’t measure without influencing the measurement. There are two main points to be made, and while they are very relevant to process management as well, they are even harder to understand. First, we CAN accurately measure the position of quantum entities, but then we can’t measure their momentum, and vice versa. Second, the relationship to other entities is a key problem of QM, because in effect the measuring apparatus would also have to be considered part of the QM system. If not, we hit all the paradoxical situations. The consequence is that the electron that sheds a photon, apparently quite spontaneously, and the electron that receives it – maybe light years away – become a single QM system. It actually means that the receiving electron is co-responsible for the photon being shed. In Einstein’s Relativity (not Newton’s world) that is impossible. In QM – when done with all consequence – it is a must. That was the dramatic conflict between Einstein and QM proponents such as Planck, Heisenberg and Schrödinger.

    The similarities are thus:
    1) I can’t measure all aspects of something and therefore can’t calculate the future even if I knew the exact functionality. (Laplace’s demon, who can measure and calculate everything.) The particle is not localized in space but can only be represented by a probability amplitude that turns to certainty when I actually measure (interact). The photon is where I look for it (the famous double-slit experiment). Simile: processes are there because we look for them, and we see what we want to see. The interpreter defines the result, not the measurement or sender. That leads to point
    2) No event stands on its own. It is defined by the context in which it is received or identified. A completely unexpected, undefined event can be seen as simply some state changes, but we have no clue if and how they are related and what the consequences are. Event relevance is the primary criterion. Who cares about the event of a falling piano if it was not above your head? The event is the same but the context is different. The context defines the meaning of an event (interprets it), not its content.

    Nevertheless, I would today not describe organizations as quantum systems, but rather focus on the principles of complex adaptive systems, because in a quantum system the entities are not individuals and not unpredictable. The unpredictability is a systemic consequence of measurement. But yes, it does influence the chaotic aspects with sensitivity to initial conditions.

    I thus see game theory as more interesting than QM:

    Can Game Theory Improve Business Strategy?

    For the event-handling I prefer to look at Pattern-Matching, where the event and the receiver become more or less the same.

    Can BPMN and Rules identify Complex Events?

    My non-IT thoughts on QM are here:
    http://quantumresonance.wordpress.com/

    Once again, a great post and a good article making the point of unavoidable unpredictability. Thanks, Max

  2. Thanks for the comment! Yes, I fretted for a long time over whether to compare to quantum mechanics, for many of the reasons you list. It is not clear that the intended audience has enough familiarity with quantum concepts to leverage that effectively. Still, I need some way to talk about “post-Newtonian” ideas. The change to quantum mechanics was very disturbing to the scientists of the time, and I think there is a parallel to the change that organizational managers face. It really is the case that the Newtonian viewpoint makes people think of organizations as machines, and that blinds us.

    I knew the comment about not being able to know the position of an electron would be caught by someone particular about the details. Of course, as you know, you can know either the location or the momentum, but the subtlety of this is not widely appreciated. What it really means is that the more precisely you know the location now, the less able you are to know where it will be a moment from now. I suppose I should have said that you are unable to predict where it will be … that is probably the right way to say it to avoid this correction.

    I agree that the QM comparison is risky, but there is one aspect that I really like: the way the uncertainty principle causes a fundamental fuzziness to the universe. The problem with the Newtonian view is that it leads one to believe there can be infinite precision and infinite predictability. The idea that you cannot predict the details leads a Newtonian to believe that nothing can be predicted at any level. But the amazing thing is that even though you cannot predict the position of an electron, you CAN predict the location of the desk. Similarly, even though you cannot predict the actions of an individual, you CAN predict the actions of an organization.

    Thanks for the comment, and the good advice.

    • Hi Keith, sorry if my comment came across as a correction. I meant to say why I felt you were originally right to warn about using quantum physics comparisons. But yes, they do work, though it is hard to gain understanding. So we do agree, but the comparison can easily be misunderstood.

      One more point on your reply. Yes, we can predict the statistically average positions of all the electrons of a desk within the confines of the desk. We can statistically predict the behavior of a group of people as well. But once again I would caution against the Newtonian view. People are individuals, and a social network is therefore not like physical matter but a complex adaptive system. It is this illusion of statistical analysis and predictability that causes people to sell, buy and use predictive analytics that are of very doubtful benefit. I call them self-fulfilling prophecies. You measure what you model and not what it really is.

      Processes are there because we look for them, but that does not mean they are there, or that all individuals will interact in them the same way. So we can only measure outcomes that have only weak causal connectivity to the process (customer satisfaction statistics). We then start to measure our model and try to optimize it without understanding the true consequences of any ‘optimization’. They will vary the process outcome all over the place with regard to individuals.

      Therefore the solution is not to create an optimal process (the ideal location/path of table particles as the statistical average of all) but to allow each particle to vary its behavior according to its current slightly chaotic (thermal entropy) variance.

      It is the statistical illusion that stops people from looking at the real thing – OTHER PEOPLE!

      Thanks again, Max

  3. Pingback: Social BPM « Euroside

  4. Keith – Great post and absolutely on the money.

    You may remember when I was publishing Process Product Watch (I know it was a long time ago … about ’94); anyway, we did point to the organic lens for understanding the organization. At the time, I was reading a thesis from a Scandinavian PhD student whose name escapes me for the moment.

    That is one of the reasons why “After Thought” by James Bailey resonated so clearly (published in ’97), again making the connection between strategy, mathematics and emergent behavior. BTW – that’s another great read along this vein … more an alternative history of western philosophy.

    Personally, I believe the jump to QM (tipping my hat to Max) is perhaps a jump too far here. I am not sure it is necessary once you get past the highly mechanistic / bureaucratic perspective and take on a more holistic ecosystem view.

    Also I think that this sort of thinking has great impact on what we think of as process architecture. Back to my position that Case Mgt (adaptive/dynamic … who cares) is more a design pattern – a way of thinking: the unfolding nature of the world provides an opportunity for workers to respond by applying procedural fragments as they see fit, within the context of an overall case.

  5. Keith,
    I enjoyed reading the post and I think from a techies standpoint you are right on the money, and that is where I see the problem.

    When talking about organizations and processes that enable work, we need to be talking to the senior business people – not the techies. So I am pretty sure an article on Newtonian vs. quantum mechanics won’t make much of an impression on them – no matter how well it describes things.

    I think that most of them intuitively know that knowledge work is not predictable work, and that in a knowledge-based organization things act differently than on an assembly line. Processes exist, but they interweave with human collaboration, interaction, negotiation, discussion, etc. For me that is the heart of the issue – but once you say “Process” you tend to have only the technical (or semi-technical) people pay attention – and they ignore all the stuff that happens around the process (even though that may be most of the work, or cost) – and you get back to a simplistic, structured, workflow-oriented view of Process.

    I remember a colleague of mine back in IBM thought we should use anthropologists as process analysts, not techies. I doubt if that will ever happen, but at least it makes a statement about where the complexity in business processes actually lies.

    Jacob Ukelson – CTO ActionBase

  6. I like Max’s comments and glad he did all the Heisenberg stuff so I wouldn’t have to write that down : ) My high energy physics is a little rusty since my internship at SLAC many years ago!

    I would caution (in my own experience) that there is a difference between a useful abstraction and an (over-)simplification. An abstraction does not negate or deny the existence of complexity; it just allows the mind to organize complex systems into more manageable chunks that can be understood. Obviously, if one thinks that the abstraction represents the whole system, then you can run into issues.

    I once worked with someone who could describe everything he was working on in excruciating, and correct, detail. But we used to describe it as an obfuscation field because there was absolutely no abstraction in his communication – and so you were bombarded with all the low level details and then had to build the results into your own mental model yourself (abstraction). It really was quite extraordinary:
    Me: “So how’s the work going on that configuration module?”
    Him: “So I found this really interesting defect in testing where if you take this that and the other…” followed by 100 similar details, not even grouped by function or area of code or module or deliverable. How about, “it is going well, I just ran into two or three snags that should be resolved by tomorrow”.

    Finally, let’s give Newton some credit. Some of his simplifications (calculus) were pretty useful. ; )

  7. Keith
    I listened with interest to your Fujitsu webinar today on BPM. I was the one that asked the question about process knowledgebases … and while I have not read all of the exchanges here … the ecosystem analogy and QM are heading in the right direction. I have now mapped the research and development process ecosystem and can show you the processes, process conformations, event chains, their value streams, their pathways and value flows and can … for this largely social ecosystem … show you the right recipe … the mix and balance among public R&D, private R&D and collaborative R&D that optimizes the value output for nations (federations of enterprises) as well as for enterprises themselves.
    I would like to chat at your convenience and show the ecosystem architecture you can use for most any process universe.
    Dr. Alan Cornford
    604-961-1658
    abcornford@shaw.ca

  8. Pingback: Social Business Doesn’t Mean What You Think | Collaborative Planning & Social Business

  9. Pingback: Misunderstanding Complexity: Convictions for Earthquakes | Collaborative Planning & Social Business

  10. Pingback: Wirearchy – a pattern for an adaptive organization? | Collaborative Planning & Social Business
