Improving the Business Process Simulation

by Boris Lublinsky on Mar 18, 2009 |


In his recent post Keith Swenson notes that the usefulness of business process simulation is often debated, and describes the two camps of people who take part in such debates.

The simulation optimist probably expects a little too much detailed insight into how future cases will flow through the process.

The optimist expects precise quantitative measures of what everyone in the process will be doing. Simulation can provide this information, but in order to do so, the simulator must be provided with precise quantitative measures of the cases that will be coming, the amount of time it takes to handle every activity, and precise models of the workers who will be doing their jobs. This works passably only in cases where you have extremely large numbers of essentially identical cases, and for jobs where individual differences in the skill, knowledge, or background of the workers make no difference at all. Even in these cases, tracking down accurate information on activity time and case distributions can take more effort than the value of the result of simulation. While the optimist has inflated expectations, there is no denying that simulation can give you good information on relatively straightforward scenarios, like "What if my case load increases 20%, where will I have to increase resources?" or "If I can eliminate one step for 90% of least-risk cases, how do the time savings compare to the expected cost of the low-probability problem?"
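The "straightforward scenario" in that quote can be checked with a few lines of Monte Carlo code. The sketch below comes from neither post; it assumes a single shared work queue, Poisson arrivals, and exponentially distributed handling times (all rates are invented for illustration), and simply compares mean waiting time before and after a 20% increase in case load.

```python
import random

def simulate(arrival_rate, service_mean, workers, n_cases, seed=0):
    """Mean time a case waits in queue before a worker picks it up.

    Cases arrive as a Poisson process (arrival_rate per hour) and go to
    whichever worker frees up first; handling times are exponentially
    distributed with the given mean (hours).
    """
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * workers             # when each worker is next free
    total_wait = 0.0
    for _ in range(n_cases):
        t += rng.expovariate(arrival_rate)        # next case arrives
        w = min(range(workers), key=free_at.__getitem__)
        start = max(t, free_at[w])                # wait if all are busy
        total_wait += start - t
        free_at[w] = start + rng.expovariate(1.0 / service_mean)
    return total_wait / n_cases

# "What if my case load increases 20%, where will I have to increase resources?"
base = simulate(arrival_rate=8.0, service_mean=0.4, workers=4, n_cases=50_000)
plus20 = simulate(arrival_rate=9.6, service_mean=0.4, workers=4, n_cases=50_000)
print(f"mean wait at current load: {base:.2f}h, at +20% load: {plus20:.2f}h")
```

With these assumed numbers the baseline runs at roughly 80% utilization, and the 20% load increase pushes it near saturation, so waiting time grows disproportionately. That non-linear effect is exactly what this class of question needs simulation (rather than a back-of-the-envelope average) to reveal.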

The pessimist typically does not expect much from simulation, but even he presumes that:

... running a simulation provides an ability to see the dynamics inherent in the model, as an aid to understanding the process as it is modeled. Formal graphical models are things that are not entirely natural to most people, and we can all use help understanding the model. This is a kind of "debug" capability that allows the business analyst to find basic problems before they get into the next step of development.

Evaluating the importance of simulation for these two camps, Keith notes:

For the pessimist, simulation is very important because it allows the business person to work the problems of the model out at the business level, before it is transformed into a model for the system domain. Since mistakes at one level are magnified after transformation, it is important to use whatever techniques are available in order to debug before going to the next level.

He also notes that:

For the optimist, the problem is that simulation of the business domain model may, or may not, have any relevance in the system domain or in the enactment domain. Spending a tremendous amount of time building a precise model of the workers and the caseload may be for nothing if the model is transformed for actual execution. Finding that a particular resource level produces optimum flow through the business model may turn out not to be the optimum solution once the model is transformed. Simulating in the system or enactment domain will give you a better optimum, but these models don't have meaning for the business analysts, and it may not be obvious how to translate the optimum situation back into the business domain. Simulation optimists are finding that simulations are significantly complicated when the model is transformed.

Bruce Silver further elaborates on the topic, noting that process simulation "could be of great value", but only if the simulation tools were any good. He defines the seven most important features that simulation tools should support:

  • Event probability and time of occurrence. In BPMN, events provide an expressive visual language for describing the exceptions that occur in real-world processes. In fact, these exceptions are usually at the root of performance problems in the as-is process. To project the expected to-be improvement, you need to be able to assign a probability and mean time of occurrence for events in the process model.
  • Repeating activities. BPMN has two types of repeating activities, called looping (DoWhile) and multi-instance (ForEach). You need a simulation parameter to model the number of iterations.
  • Instance properties. In most simulation models, the probabilities at each node are uncorrelated. In real-world processes they are highly correlated. For instance, the duration of a particular task, the probability of taking a particular gateway output, and the probability of some event occurring usually track together. In other words, certain classes of instances tend to take longer, tend to take path 1 rather than path 2 out of the gateway, and have a higher than usual probability of some in-flight event... One way is to define the simulation parameters not as a simple number (mean and standard deviation) but as an expression of one or more instance properties, such as orderValue, which could take values high, medium, and low. This makes configuring instance generation more complex, as you need to define the rate of each type, but it could provide much better output.
  • Contingent resource assignment. Most simulation tools let you assign tasks to roles (or groups of roles) with some defined cost-per-hour or cost-per-use parameter. But not many let you say: assign to role A as the primary resource, but if no member of role A is available, then assign to role B.
  • Prepopulation of backlogs. Simulation models generally start empty, meaning no instances in the system. Besides the obvious distortion at the beginning of the simulation period, this does not allow the resource allocation optimization use case to apply when it is really needed, i.e. when an actual running process is jammed up with backlogs and you need to try various alternatives for working out of it.
  • Access to raw output. The prebuilt metrics and charts provided by simulation tools are convenient but they rarely provide the detail you need for real analysis. For that you need the raw output, one record for each process instance, and also one record for each activity or event instance, dumped into Excel or a database. From that you can create histograms, provide activity-based costing, and perform other useful analysis. Without it you basically have eye candy.
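Silver's "instance properties" and "raw output" points can be illustrated together. The sketch below is hypothetical (the orderValue classes, their mix, and all parameter values are invented for illustration, and this is not Silver's or any vendor's API): task duration, gateway routing, and event probability are all driven by a single instance-level property rather than drawn independently, and the simulator emits one raw record per process instance as CSV instead of prebuilt charts.

```python
import csv
import io
import random

# Simulation parameters keyed by the orderValue instance property, so
# duration, routing, and exceptions are correlated: "high" instances
# take longer, take path 2 more often, and hit more in-flight events.
PROFILES = {
    # mix = share of arriving instances in this class
    "high":   {"mix": 0.1, "duration": 6.0, "p_path1": 0.3, "p_event": 0.20},
    "medium": {"mix": 0.3, "duration": 3.0, "p_path1": 0.6, "p_event": 0.08},
    "low":    {"mix": 0.6, "duration": 1.0, "p_path1": 0.9, "p_event": 0.02},
}

def run(n_instances, seed=1):
    """Generate one raw record per simulated process instance."""
    rng = random.Random(seed)
    classes = list(PROFILES)
    weights = [PROFILES[c]["mix"] for c in classes]
    rows = []
    for i in range(n_instances):
        cls = rng.choices(classes, weights)[0]
        p = PROFILES[cls]
        rows.append({
            "instance": i,
            "orderValue": cls,
            "duration_h": rng.expovariate(1.0 / p["duration"]),
            "path": 1 if rng.random() < p["p_path1"] else 2,
            "event": rng.random() < p["p_event"],
        })
    return rows

# Raw output: dump every instance record as CSV for external analysis
rows = run(10_000)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue().splitlines()[0])   # header row of the raw dump
```

From a dump like this one can build histograms, do activity-based costing, or slice results by orderValue in Excel or a database, which is the kind of analysis Silver argues canned charts cannot support.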

It looks like both simulation optimists and pessimists stand to benefit from improvements to simulation tools, which would raise the quality and reliability of simulation results.
