
The BPM "Fantastic Four"


Sep 7, 2007
By Stephen Coco





The Human Torch: Process Automation

Automation is like a blowtorch: a wonderful tool until it burns the house down. Automating a broken process without first fixing it or, at the very least, making it “automation ready” can immeasurably damage your business operations.

BPM has changed the traditional relationship between IT/vendor and functional owner. Project ownership is no longer handed over at requirements development; co-ownership is now prevalent throughout the project lifecycle.

Here are some best practices for process automation:

Be aware of changing needs. Traditionally, if a customer’s needs changed between order and delivery, the customer received what was ordered, not what was needed. Today’s processes must respond to feedback and sudden changes in customer needs. Process change is inherent to BPM and should be considered part of the ‘living’ business requirements. Fixing requirements can leave organizations in a fix.

Keep your methodology flexible. Changing line-of-business requirements, high-level shifts in the direction of your business, and evolving customer needs all expose processes and systems to ongoing evolution. Avoid traditional waterfall methodologies and other non-iterative development or project-delivery methods. Consider instead more flexible, iterative methodologies like Scrum or RUP.

Understand upstream and downstream impact. Automating processes may reduce handoffs between functions, but it will never eliminate them. BPM planning must account for overlapping systems and the downstream impacts of process changes.

Automate measurable processes. “Smart” BPM, the convergence of BPM and business intelligence, relies on empirical analysis to evaluate effectiveness. This is a two-sided thesis: automate because a process needs to be analyzed objectively, and automate those processes that can be gauged objectively.

Contrary to popular belief, start with high-impact, high-risk processes. With supporting data, process effectiveness and efficiency are easily understood and reported to all interested parties, as the sketch below illustrates.
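To make “measurable” concrete, here is a minimal sketch in Python of the kind of empirical check a smart-BPM report might run: the timestamps and the 48-hour completion target are illustrative assumptions, not figures from any particular engine or the article itself.

    # Minimal sketch: measuring a process against a service-level target.
    # Timestamps and the 48-hour target are illustrative assumptions.
    from datetime import datetime, timedelta

    SLA_TARGET = timedelta(hours=48)  # assumed total-time-to-completion target

    completed_orders = [
        (datetime(2007, 9, 3, 9, 0),  datetime(2007, 9, 4, 16, 30)),
        (datetime(2007, 9, 3, 11, 15), datetime(2007, 9, 6, 10, 0)),
        (datetime(2007, 9, 4, 8, 45),  datetime(2007, 9, 5, 17, 20)),
    ]

    # Cycle time = delivery minus receipt; compare each against the target.
    cycle_times = [delivered - received for received, delivered in completed_orders]
    within_sla = sum(1 for ct in cycle_times if ct <= SLA_TARGET)

    print("Average cycle time:", sum(cycle_times, timedelta()) / len(cycle_times))
    print(f"SLA compliance: {within_sla / len(cycle_times):.0%}")

Once a process emits this kind of data automatically, reporting effectiveness and efficiency becomes a query rather than a debate.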

Mr. Fantastic: Operational Excellence

As the leader of the Fantastic Four, Mr. Fantastic, like operational excellence, is the keystone to successfully avoiding Dr. Doom. Pit a good person against a bad process, and the bad process wins every time. Many organizations spend too much time trying to fix effective staff rather than broken processes. Remember, your employees are not superheroes, so don’t blame them when it’s your processes that are at fault.

Best practices for operational excellence:

Employ controls. Identify the key controls in all processes. Understand your service-level and operating-level agreements so you can manage total time to completion and track performance against your controls.

Do not see automation as the end. You haven’t “gotten there” by automating; you need to continually measure the impact of value-based steps and rethink your processes.

Trust your data. Base BPM decisions on your data, not on speculation. Data doesn’t lie—at least not as often. Use relative ranking based on empirical analysis to shift your priorities for improvement efforts (see the sketch after this list).

Use philosophies to change behavior. Instill in your workforce the concepts and tools that lead to constant “process thinking.” For long-term process optimization, consider philosophies geared toward continual process improvement, such as Lean Six Sigma.
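Here is a minimal sketch, again in Python, of what relative ranking based on empirical analysis can look like; the process names, error rates, and rework hours are invented for illustration.

    # Minimal sketch: ranking processes for improvement using measured data.
    # Names, error rates, and rework hours are illustrative assumptions.
    processes = [
        {"name": "invoice approval",    "error_rate": 0.08, "rework_hours": 120},
        {"name": "customer onboarding", "error_rate": 0.03, "rework_hours": 45},
        {"name": "order fulfillment",   "error_rate": 0.12, "rework_hours": 200},
    ]

    # Simple composite score: the more errors and rework a process generates,
    # the higher it rises in the improvement backlog.
    ranked = sorted(processes,
                    key=lambda p: p["error_rate"] * p["rework_hours"],
                    reverse=True)

    for rank, p in enumerate(ranked, start=1):
        print(f"{rank}. {p['name']} (score: {p['error_rate'] * p['rework_hours']:.1f})")

Swap in whatever metrics your controls actually capture; the point is that priorities shift when the numbers say they should, not when the loudest stakeholder does.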

Stephen Coco is an associate principal at Intellilink, a management consulting firm that improves the productivity of knowledge-worker organizations. His areas of focus include talent management, workforce optimization, process redesign, and business process management.



