
Bruce Silver, Contributor

January 22, 2007


Continuing my recent post on simulation analysis and BPMN… In Use Case 2, the problem is usually framed in terms of "bottlenecks." Static analysis gives you a rough idea of the staffing requirement even without simulation. For example, if over the workday you create 100 instances an hour, and Task A takes 1 hour, you need around 100 people performing Task A to keep up. But what if, instead of creating 100 instances an hour, you are getting 800 instances overnight, and your resource for Task A is also responsible for Task B? Then simulation gives answers you can't get from static analysis.

But it turns out simulation gives surprising results even in the case where static analysis says you have enough resources to do the job. For example, in preparing Use Case 2 scenarios for my BPMN training, I would vary staffing counts across a set of scenarios for a given instance creation volume. But when the resource utilization rate -- the ratio of active time on tasks to the resource's total available time -- got up to around 80%, the process could not keep up with the instance creation rate. I had heard anecdotally that 80% was some kind of rough maximum for real utilization rate, but I didn't think it would show up so dramatically in the simulation analysis.
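You can see the same effect without a commercial BPM tool. Here is a minimal queueing sketch, not the author's simulation tool: it assumes Poisson arrivals and exponentially distributed task durations (my assumptions, not stated in the post), with a single pool of interchangeable workers. As the utilization rate (arrival rate × task time ÷ worker count) climbs toward 100%, the average time instances spend waiting in the backlog grows sharply, even though static analysis says the pool has enough capacity.

```python
import heapq
import random

def mean_wait(arrival_rate, n_workers, service_mean=1.0,
              n_instances=200_000, seed=42):
    """Average time (hours) an instance waits in queue before a worker
    picks it up. Poisson arrivals, exponential task durations, FIFO."""
    rng = random.Random(seed)
    free_at = [0.0] * n_workers   # min-heap: when each worker is next free
    heapq.heapify(free_at)
    t = total_wait = 0.0
    for _ in range(n_instances):
        t += rng.expovariate(arrival_rate)          # next instance arrives
        start = max(t, free_at[0])                  # earliest available worker
        total_wait += start - t
        heapq.heapreplace(free_at, start + rng.expovariate(1.0 / service_mean))
    return total_wait / n_instances

# Static analysis: 100 instances/hour x 1 hour per task => ~100 workers.
# Watch the average backlog wait as utilization (100 / n_workers) rises:
for n in (150, 125, 110, 105):
    print(f"{n} workers (utilization {100/n:.0%}): "
          f"avg wait {mean_wait(100, n):.3f} h")
```

The waits are negligible at 67% utilization and blow up disproportionately past 80% or so -- variability, not average load, is what creates the backlog, which is exactly what static analysis misses.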

I think in the past I commented on those "smart" analysis tools in BPM that give you hints like "big backlog at activity XYZ -- try adding resources." Well, duh! It doesn't take a genius to know that if you add resources to an activity you're going to reduce the backlog. But at what cost? Use Case 2 is really about the tradeoff between cycle time, resource utilization rate, and total cost.

And here is where the "standard" costing reports out of my simulation tool were not the most helpful out of the box, although they provided the data I needed for my custom Excel reporting template. That's because the cost you want is NOT the resource's active time x the resource cost per hour, but the resource's total available time x the resource cost per hour. As you add resources, the total active time for N+1 resources is the same as for N resources, so the cost measured the standard way doesn't change. But try telling the hiring manager the cost won't change by adding that (N+1)st person. I had to build custom costing reports, but it was easy: the total cost for a resource is basically the active labor cost divided by the utilization rate, which the tool provides.
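The arithmetic behind that custom report is one division. A sketch (the function name and the numbers are illustrative, not from the post):

```python
def resource_costs(n_resources, hours_available, rate_per_hour, utilization):
    """Compare the 'standard' active-time cost report with the true cost
    of paying for the resources' full available time."""
    active_cost = n_resources * hours_available * utilization * rate_per_hour
    # True cost = available time x rate, i.e. active cost / utilization:
    total_cost = active_cost / utilization
    return active_cost, total_cost

# 10 people at $50/hour, 40 available hours each, 60% utilized:
active, total = resource_costs(10, 40, 50.0, 0.60)
print(f"standard (active-time) cost:  ${active:,.0f}")
print(f"true (available-time) cost:   ${total:,.0f}")
```

Note that `active_cost` stays flat as you add resources (the same work is just spread thinner, so `utilization` drops), while `total_cost` rises with headcount -- which is the number the hiring manager actually cares about.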

Dr. Bruce Silver is an independent analyst, consultant and author of the BPMS Watch blog. Write him at [email protected].
