Article: The Seven Fallacies of Business Process Execution

| by Stefan Tilkov on Dec 05, 2007 |
In a new InfoQ article, Jean-Jacques Dubray, InfoQ SOA editor and author of the InfoQ minibook "Composite Software Construction", explores a new architecture blueprint for BPMSs that offers a cleaner alignment between SOA and BPM. Jean-Jacques argues that after more than eight years of intense research, the promises of BPM have not materialized: we are still far from having the ability to use the business process models designed by business analysts to create complete executable solutions.

Jean-Jacques lists what he considers to be the main myths, the typical misconceptions with regard to BPM:

  1. Business analysts model their processes from a system's point of view
  2. Business users can easily learn BPMN and use all its features
  3. Business analysts should be able to create executable solutions from process models
  4. If we add a magical BPMS that creates solutions directly from business analysts' inputs, we will not need to develop any integration with existing systems, change existing systems of record, or do any QA
  5. Business Process Execution must be centralized
  6. Business Process Execution semantics can be derived easily from existing programming concepts
  7. The collaborative implementation paradigm, in which executable design is layered on top of the BPMN model, is the way to go.
He addresses each of these in turn, explicitly detailing his own alternative vision with regard to #5.

Read the full article here.

Community comments

A useful list of fallacies though light on the role of business rules by James Taylor

I blogged about this article and discussed some of these fallacies as they relate to business rules and decision management. Some of them could be less true if BPM vendors/practitioners took decisions more seriously, and some of them are great analogies for similar issues in the rules space. Check out the post here.

The Smart (Enough) Systems blog | My ebizQ blog
Author of Smart (Enough) Systems


GRAFCET and resource management by Azrul Hasni MADISA

I blogged about how GRAFCET can be used to do resource state management [ ]; do take a look

Business-IT alignment should still be an aim by Marlon Dumas

I cannot help but agree with many of the opinions you share here. However, you seem to be overly pessimistic regarding the possibility of aligning IT systems with business operations by bridging analyst-level process models (e.g. BPMN) with executable process models. Granted, there is no magic button that will bridge the two. However, sound methods and wisely chosen tool support can go a long way in this direction. For example, if we align BPMN models with BPEL process definitions, we can at least partly elucidate the impact that business-level changes have at the implementation level. Oracle BPA, for example, is a modest step in this direction. Clearly, it's not a silver bullet and it won't magically solve the business-IT alignment equation, but still, it shows that something can be done to keep business models and code in sync.
The question of how "task-centric" process models should be is valid, but perhaps orthogonal to the BPMN round-tripping debate. I mean, we can argue whether BPMN's task-centricity is the way to go for process modeling (at various levels of abstraction), but that's a separate point.

Re: A useful list of fallacies though light on the role of business rules by Jean-Jacques Dubray


BRMS are definitely my area of expertise and I second many of your comments. Just a couple of clarifications:

>>There is no reason why this information systems cannot also decide how to act
Actually I totally agree, I thought this was conveyed by "advance.. the state", but I am glad you made it clearer.

>> I do think that collaboration is key - business users and analysts must be able to collaborate with IT to define processes and decisions

I think the question is really focused on "collaborate" vs. "communicate". I would argue that "communicate" is a better value proposition than "collaborate". To me, "collaborate" implies long joint sessions, while "communicate" conveys a better separation of work and a clean handoff. You collaborate because you can't reach the point where this handoff is possible.

Re: GRAFCET and resource management by Jean-Jacques Dubray


Thank you so much for bringing back so many memories. I used GRAFCET in the early 1990s when I was building (industrial) process control systems for the semiconductor industry (using Objective-C and NeXT on the front end). I actually tried in 1999 to discuss these concepts with the team I was working with at eXcelon, so I do believe the concepts are good, but I also believe that BPEL can do the job just fine. It is not perfect, but I can live with it; compared to starting over, I'd rather fix a few things in BPEL. If you read about "wsper", you will also realize that the core of the wsper programming language is very close to GRAFCET. I had not looked at the language in almost 10 years, so I can't claim this was a conscious decision, but I argue that I can compile this language into BPEL.

Re: Business-IT alignment should still be an aim by Jean-Jacques Dubray


Thanks for your comments. I am very impressed by how far your team has been able to go. It means that BPEL as a language is pretty well designed, considering that you are imposing the constraint of generating readable BPEL code.

I am actually not "pessimistic" about aligning BPMN with executable process semantics, I am simply saying that I am a bit surprised that this is the direction some people are looking at because it negates the existence of the "resource" as a key ingredient of the process.

Now, this does not mean that your work is not useful; as you mention, understanding the impact of changes at the process definition level is a key benefit.

I am pretty sure you could also go in the other direction and provide a view of the "process definition" once the BPEL code has been implemented, so that business users would automatically have the "as-is" view of the process should they be looking at improving it at a later stage (this has tremendous value, because analysts spend a lot of time just understanding the current state).

Finally, another area of interest could be "verification": checking that the process implementation (based on an assembly of BPEL definitions) actually implements the process definition.

>> we can argue if BPMN's task-centricity is the way to go for
>> process modeling (at various levels of abstraction), but that's
>> a separate point.
The key question is whether you want to take the point of view that "a process is the collection of activities that advance the state of resources as they are transformed or consumed". This is a 100% task-centric proposition (automated tasks, which is where decision services would fit, James, or human tasks). If you take the point of view that a process owns everything between the presentation layer and the data access layer, then I would say you are driven towards the kind of approach you are exploring, and therefore you are forced either to have developers tweak the BPEL code, or to have business analysts use BPMN in a way that writes the correct BPEL.

My proposal is quite different: it starts from the resource / business entity level and assumes their lifecycles to be fairly stable (and unbreakable, meaning a process cannot change the lifecycle of a resource once it has been defined). From that point, the process is simply an assembly of resource lifecycles and "activities"; there is much less code to write, and the BPEL code is written once and reused in any process the resource is involved in. I was actually quite amazed to see Dominique Vauquier come to the same conclusion, but from a pure methodology angle, trying to improve the way business analysts improve processes. I can only encourage you to read his article, which I translated from French and which will be published on BPTrends next month.
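A minimal sketch of this proposal, with purely illustrative resource and state names (an invented `Order` resource; nothing here comes from the article): the lifecycle is defined once per resource type and cannot be redefined by a process, and a process is merely a sequence of activities that advance the state.

```python
# Sketch: the lifecycle is defined once per resource type ("unbreakable");
# processes may request transitions but can never redefine the state machine.

class LifecycleViolation(Exception):
    pass

class Resource:
    TRANSITIONS = {}  # state -> tuple of allowed next states

    def __init__(self, initial):
        self.state = initial

    def advance(self, target):
        if target not in self.TRANSITIONS.get(self.state, ()):
            raise LifecycleViolation(f"{self.state} -> {target} not allowed")
        self.state = target

class Order(Resource):
    TRANSITIONS = {
        "created":  ("approved", "rejected"),
        "approved": ("shipped",),
        "shipped":  ("closed",),
    }

# A process is then merely an assembly of activities that advance
# resource lifecycles; it contains no state logic of its own.
def fulfilment_process(order):
    order.advance("approved")  # decision activity
    order.advance("shipped")   # fulfilment activity
    order.advance("closed")    # completion activity
    return order.state
```

Any process that tried to skip straight from "created" to "shipped" would raise a `LifecycleViolation`, which is the "unbreakable" property described above.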

The problem with your approach is that you can never be sure that a process will not lead to unwanted transitions in the lifecycle of the resource.

A team of researchers at IBM Research in Zurich is working on this topic. by Jean-Jacques Dubray

Marlon Dumas was kind enough to send me the link to the home page of Ksenia Ryndina, which contains many articles that explain their research (which is very applied, since they have already built some prototypes with WebSphere Business Modeler).

I have exchanged an email with Ksenia who confirmed the relationship between her work and this article. She recommends reading a couple of references available on her home page:

Good taxonomy and ontology for modeling data, service, process, human-wf by Kjell-Sverre Jerijærvi

You describe exactly how many people miss how important "resources" (domain objects) are in SOA. People are too fixated on the business processes when modeling their services and orchestrations, and thus forget to create and govern a semantic canonical data model (or equivalent semantic transformations). This is what David Linthicum, Jack van Hoof, Nick Malik, you, me, and others blogged about this July. I have commented on fallacy #5 ("Business Process Execution must be centralized") and related it to the CDM discussion and the service taxonomy (processes vs. orchestrations) discussion in my blog:

Interesting article, but we still need executable business processes by Alexander Samarin

I blogged about this article - see

My comments on this article are based on my experience as a seasoned IT specialist; they are also expressed in my forthcoming book “Improving business process management systems” (see

What about using the Process Virtual Machine ? by Miguel Valdes Faura

Hi Jean-Jacques,

Find hereafter my latest post on the BPM Corner community on how the Process Virtual Machine could be considered as a core technology to implement most of the containers and modules required in the architecture you propose to handle business processes.

best regards,
Miguel Valdes
BPM Corner,

repeating patterns... by Shaun Forgie

Firstly, congratulations on what is one of the most lucid and well-written process modelling articles I have ever read. Believe me, I read a lot... :-)

Historically, the progression from procedural to object-oriented programming languages has allowed us [humans] to build significantly more complex systems by establishing an appropriate set of conceptual and language-related apparatus with which to manage this complexity. Objects encapsulate data and behaviour relating to a small piece of a much more complicated working system. The bigger composite solution is made easier to understand by the fact that we can progressively dismantle it into smaller pieces. Thus complexity is generally managed through a recursive process of division: dismantling a big thing into smaller pieces that are individually easier to understand.

The fact that resources can exist independently of a process and the fact that they can participate in more than one process means that the need to understand and model them as separate entities is an important conclusion. A workflow process therefore can be viewed as a resource [state machine] co-ordinator responsible for transitioning one or more resources through one or more transitions.

The complexity has always been in trying to understand and describe a process environment where the triggers responsible for firing these resource transitions can originate indeterminately from either a human and/or system generated event. This problem has been compounded with the introduction of Business Rules and Scheduling sub-systems.

In my mind, process models can be viewed as structural relationships between input and output resources, where the relationships between resources are defined as transition tables, and valid process execution sequences of transitions are contained within process resources.
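That transition-table view can be sketched roughly as follows, using hypothetical states loosely inspired by the job-application process discussed elsewhere in this thread (the state names are invented for illustration):

```python
# Sketch: each resource type has a transition table; a proposed process
# execution sequence is valid only if every step appears in the table.

TRANSITIONS = {
    "application": {
        ("received", "screened"),
        ("screened", "interviewed"),
        ("interviewed", "offered"),
        ("interviewed", "declined"),
    },
}

def valid_sequence(resource, states):
    """Validate a sequence of states against the resource's transition table."""
    table = TRANSITIONS[resource]
    return all(step in table for step in zip(states, states[1:]))
```

With this, a process definition becomes data that can be checked mechanically: `valid_sequence("application", ["received", "screened", "interviewed", "offered"])` holds, while a sequence that jumps from "received" straight to "interviewed" does not.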

Re: repeating patterns... by Shaun Forgie

I guess in the end it's the recursive application of state machine semantics to progressively smaller and more detailed descriptions of a workflow, orchestration, or process. The notion of a task container is really just another way of understanding how transition events get triggered.

Re: repeating patterns... by Jean-Jacques Dubray


thanks for your comments. I think you nailed it right there:

The fact that resources can exist independently of a process and the fact that they can participate in more than one process means that the need to understand and model them as separate entities is an important conclusion. A workflow process therefore can be viewed as a resource [state machine] co-ordinator responsible for transitioning one or more resources through one or more transitions.

I think that somehow "Computing" has led us down the wrong path. The way we build information systems today is in no way based on information system concepts but rather on "programming concepts". It is now time to invent an information- and process-centric programming model. This is just the beginning.

... and after 3 years... by Gonçalo Borrêga

We are publishing a whole new way to do things. OutSystems has a platform for agile software development that has been in use for many years. We too implemented BPM, facing all of these fallacies. On top of this experience we built our own modeling language. We came to realize that the tiny bit that makes the difference is having a strong binding to your data model and a strong binding to your interfaces.
Our platform already had that. When we used the same concepts with BPM (data/business rules and user interface (web screens)), we reached a way of process design that can be, not given to, but created with the analysts. The developer then goes and implements the business rules or the actual screens...

I think the solution is in having good technologies to bind these three elements: DataModel/Business Layer, User interface, Business Processes.
You can have on each a different set of specialized roles that work together very well.


great article by Thomas clerk

Great, great article.
Most people get confused about CASE tools and how they can help us. Some others (see the comments) think we need executable processes (and so what! computer science has had executable processes since it started); what, IMHO, they need is to learn about software engineering, read the papers, and DO learn what a CASE tool is. It means Computer-Aided Software Engineering; just from the name we should understand that it is "aided", not "magic".
And last but not least, choose the right CASE tool for you and your company (or your team); there are a lot of people working on that, see and here as two examples....

And BPM is sometimes the right way to see the logic and the business, but not (never!) the architecture and the quality attributes behind the logic... so it is impossible to think of a silver bullet.

just my two cents...

kind regards,

Re: great article by mohammad mohammadi

Thank you, sir. I had a question. You mentioned an article which proposed a method for transforming BPMN models into BPEL models. However, I noticed that the authors assume that each activity in the BPMN model is equivalent to a service invocation in the BPEL model. Does it make any contribution to identifying services? Could we call these kinds of approaches service identification approaches? In fact, they only cluster operations into services. I am really looking forward to hearing your opinion.

BPM in-a-can (4 years later) by Jesse Starks

After years as a developer and systems engineer in the enterprise, I took a technology position at a small financial firm which, previous to my arrival, had adopted one of the BPM-in-a-can solutions mentioned in this article.

At the outset I was extremely impressed that two business-critical processes had been implemented in this system. After 2 weeks of digging in, however, I was discouraged by the rat's nest of variables, functions, constants, rules, and under-the-carpet black magic that it took to achieve these goals. In other words, *NOT* the "anybody can code" solution it was sold as.

On one hand, I applaud the attempt to turn a process model into something executable. On the other hand, these process models lose nearly all of their clarity/value along the way, by the time systems-integration and resource-lifecycle logic are bolted on.

Coming across this article was quite validating, because it highlights, in depth, how much consideration should be put into the theory of BPM to realize its value. (A key example being resource lifecycle versus process; the value here cannot be stressed enough.)

So here's the problem....

When BPM is sold in a can, the core principles are either glazed over or hidden entirely. It undermines the core concepts of what makes BPM's potential so strong, and leaves the impression (to the unknowing business user) that they have actually implemented BPM.

My expedition to understand this from every angle continues, but I just wanted to provide some feedback from the wild. I found this article especially interesting, as it is four years old and still quite relevant today (2011).

Re: BPM in-a-can (4 years later) by Jean-Jacques Dubray


Personally, I would prefer that these transformations used a different set of semantics for "activity" and "service invocation". In general, most people place the service layer at the "data access layer" as opposed to the "resource lifecycle layer". My recommendation would be to use subprocesses to define "process activities" and to reserve BPMN activities for low-level system interactions such as a service invocation.

The general problem with BPMN-to-BPEL generation is that it completely ignores the resource lifecycle concept. In fact, resource lifecycles and business process definitions are complementary, not isomorphic. It would be best to a) focus on a resource lifecycle implementation in BPEL (Java or C# work too) and then b) use any process engine as a "process activity engine", merely a task engine that interacts with the resource lifecycle operations.
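A rough sketch of that division of labour, with invented class names (an `Invoice` resource and a `TaskEngine`; neither appears in the article): the resource owns its lifecycle operations, and the "process activity engine" merely invokes them in order, without ever manipulating state itself.

```python
# Sketch: the resource owns its lifecycle operations; the process engine
# is only a task engine that delegates to those operations in sequence.

class Invoice:
    def __init__(self):
        self.state = "draft"

    def approve(self):
        assert self.state == "draft", "approve only allowed from draft"
        self.state = "approved"

    def pay(self):
        assert self.state == "approved", "pay only allowed once approved"
        self.state = "paid"

class TaskEngine:
    """Runs process activities; never touches resource state directly."""
    def __init__(self):
        self.tasks = []

    def enqueue(self, resource, operation):
        self.tasks.append((resource, operation))

    def run(self):
        for resource, operation in self.tasks:
            getattr(resource, operation)()  # delegate to the lifecycle layer
```

The point of the split is that the engine can be any task runner: the lifecycle rules live with the resource, so no process can reach an invalid state no matter how its activities are ordered.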

@Jesse: Yes, I think the problem is that lots of vendors have spent lots of money to build something that missed the target, and unfortunately the sales and marketing machine is charged with explaining to customers that what they built is actually a BPM solution, and that people should spend a lot of money to buy the product and implement processes with lots of consultants. The fact that no vendor or BPM consultant ever wrote a response to my article indicates to me that I was spot on, and that engaging in any discussion at that level would be deadly. I have talked to many of them, and frankly it makes me quite sad that 4 years later this article is still for the most part accurate, from BPMN to BPEL to BPM products. Ultimately, most people who spent millions of their company's money drinking the vendor Kool-Aid are not going to spend much time showing that they were wrong.

I cannot emphasize enough how far a "resource lifecycle" analysis can go for both SOA and BPM. A chief architect once told me, after I introduced the concept to him, that he understood more about his own business in one hour of RL analysis than in the two years he had been with the company.

Decomposing Process Models by Terry Roach

In 2007 I was enjoying the luxury of a few years in academia, on a quest for executable architectures, when I came across this intriguing article. The concept of resource lifecycles has been very influential on my thinking ever since. I've used this example of the job application process many times in discussions on process decomposition and have come back to read the article often since then.

I've been planning a response discussing extensions of these ideas ever since, and 5 years later I have finally written it, with Decomposing Process Models here:

Thank you for this excellent contribution, Jean-Jacques.

Best wishes with the diet, the 2nd thing you've inspired! ;-)


The Seven Fallacies of Business Process Execution: the state data of the life cycle by graham berrisford

I think I get it, except for the statement that the Application lifecycle is independent of the (annoyingly class diagram-like) data model. Surely the data structure must hold the state vector for the Application life cycle? Where is the primary key and state variable of the Application?

Re: The Seven Fallacies of Business Process Execution: the state data of th by Jean-Jacques Dubray

You only have an association between the two; there is no obligation for the data structure to "hold the state". I can change the application data structure without impacting its lifecycle. That's what I meant. Lifecycles are highly reusable across versions of the types, or variants of the same type. In other words, the properties of a type are orthogonal to the states of the lifecycle. Of course, an association needs to exist for the system to operate properly.
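A small sketch of that separation, with illustrative names and data (nothing here comes from the article): the lifecycle state is held in a separate structure associated with the Application by its primary key, so the Application's own properties can change or be versioned without touching the state machine.

```python
# Sketch: the Application's properties and its lifecycle state live in
# separate structures associated by primary key, so the data structure
# can evolve (new fields, new schema version) without lifecycle impact.

applications = {
    42: {"name": "J. Smith", "role": "Engineer"},  # properties (any schema)
}
lifecycle_state = {42: "screened"}                 # association by primary key

def current_state(app_id):
    return lifecycle_state[app_id]

def update_property(app_id, key, value):
    applications[app_id][key] = value  # property change; lifecycle untouched
```

Here the association answers the "where is the primary key and state variable" question: the key is shared, but the state variable lives on the lifecycle side of the association rather than inside the data structure.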
