This is another in a series of posts discussing why it is time to move beyond the process model. The last two posts were about BPMN and CMMN respectively, but the actual problem runs deeper. Even if you found the perfect modeling notation, the fact that you have to bring everything together into one place is a bigger barrier to success.
Defining Omniscient View
A process model, like any program, takes an omniscient view of the work being done. The model explicitly represents everything that anyone can know about the process. It contains a set of variables, and those variables are expected to represent the complete state of the process. That state exists in one software system, in one place. The designer of the business process model expects to be able to look at a business process and have immediate access to all of its parts.
To create this omniscient view of the process, the business process modeler—often a business analyst or business architect—talks to all of the people in the organization affected by the process. The business analyst must interview people from all the different roles in the process to find out what they do to further it. This is called process discovery. Process discovery brings all of the information you need to know about the business process together into one place. The business analyst then draws the process with full knowledge of every part of it: adding all of the activities that could be performed by anyone, defining the order those activities come in, and defining a set of variables to hold all the values that could be known to the business process.
Being Complete is a Burden
This can be a very large job. People are by nature very adaptive. The various parts of the organization have figured out how to get their jobs done. Capturing all of this and bringing it together can be a lot of work. One of the largest problems with BPM projects is underestimating the effort to just understand the process as it is. This is due in no small part to the fact that most of this information is tacit, held by people who have a hard time explaining exactly what they do. Even when they can explain the happy path, they rarely have a good understanding of how they handle the error cases and other outliers. Their initial simplistic notions of how the process works are often shown to be inadequate and unworkable for any unusual case. A significant amount of scenario testing and reworking happens before a real process can be hammered out. The workers being interviewed must squeeze this in during “spare time” between their other responsibilities. Even a simple project can take months, and I have seen moderately complex projects take years just to understand the as-is process.
If it is not possible to interview everybody in a particular business process, if key individuals are spread across a large geography, or if there are simply so many of them that they cannot all be contacted, then bringing all that information to one process designer becomes a serious bottleneck. This is particularly important where the job being done varies across geography and depends on special rules and regulations on a community-by-community basis. In many legal scenarios it is simply impossible to gather all of the special requirements imposed by cities, towns, municipalities, counties, states, and nations into a single consolidated place. The sheer inability to collect into one location everything necessary to build the business process becomes a barrier to its success.
Speed of Gathering vs. Rate of Change
In many cases speed becomes the limiting factor in bringing this information together. Laws and customs change regularly. To create an effective business process, this information has to be collected faster than the rate of change—say within 3 to 6 months at the most. If the collection takes longer, the business analyst runs the risk that the details change before the process is produced. An incomplete job of collecting the background creates the risk of imposing the wrong process on parts of the organization, and the wrong process can be worse than no process at all. The business analyst must therefore gather all the information and produce a running process significantly faster than the organization, the market, the law, or customs are changing.
Many purchasers are sold a BPM system with an unrealistic expectation of the time and effort it will take to discover the process. The slick product demos start from the assumption that you already know the process; proceeding from there, the implementation can generally be done quickly. In my estimation, by the time you know the process you want to implement, you are 80% finished with the project. The cost of bringing the information about those processes together into a single location, so that a single process designer can create the process model, is probably the largest, most underestimated part of any BPM project. Part of the reason is surely that working together in an organization is such a natural thing for humans to do that we don’t appreciate the full subtlety of what is going on. It can be an abrupt awakening when an organization realizes how much discussion among the participants it takes to find out what is happening today and what should happen with an automated process. Process discovery can be quite expensive.
Must be Both Quick & Complete
Without process discovery it is impossible for a business analyst to make even a simple model of the business process. This is not a problem that will be solved by just trying harder. This book will propose a completely different approach which avoids the need for an omniscient model as the backbone of how the process works. Instead of demanding that all the information be brought together into a single point, we let each part of the organization—down to the individual level—define its own part of the process, and then automatically synthesize the whole process from those parts.
Not only is process discovery significant work, but this work must be done quickly, and it must be done across many parts of the organization at the same time. The accounting department is frequently stressed by monthly or quarterly closing. The development teams are stressed around their releases. Marketing is overly busy around a trade show. When you look across the organization, there is always a good excuse to put off process discovery until later. Avoiding process discovery not only eliminates a lot of work, it also eliminates the need to coordinate that work across many parts of the organization at the same time. The ESP model will still require people to manage information about what they do, but it will leave each department, and even each team, free to do this at a time that makes sense for them.
The problems with an omniscient process model do not end when process discovery is complete, as we will see in the next post on agreement.
And if you want to know where this is going, all these posts are excerpts from the new book Beyond the Business Process Model.
Interesting article, Keith
My focus is on Critical Infrastructure Protection (countries) and Corporate Infrastructure Protection – building best practice protocols only comes after construction of a Knowledgebase consisting of entities that detail threats/risks to infrastructure, vulnerabilities, remediation, resilience, readiness, event avoidance, incident mitigation/management & recovery.
Our objective is to have an “omniscient view” and the main method we use to achieve this is the “Resource-Based View” (RBV). The pace of operational activity in run-time Infrastructure Protection is too rapid for anything other than automated sensors and reliance on user “muscle memory”, except in the recovery stage. The idea is to prevent incidents so that recovery is never needed.
“Being Complete” is indeed a burden, as the average Kbase comprises 5,000–10,000 documents or more, which, to satisfy RBV, need to be accessible/viewable from one computer screen. Some of the document inventory changes daily (e.g. terrorist attacks).