
How to Make New Technology Improve Your Process and Not Ruin It

APQC asked process experts about the new technology poised to have the greatest impact on process improvement and performance, including the struggles of adopting new technologies and how not to fail at automation. These experts will be speaking at APQC’s Process & Performance Management Conference, October 3-4, in the Optimizing Data & Digitalization breakout session track.

Process experts:  

  • Dr. Mathias Kirchmer – Managing Director and Co-CEO, BPM-D
  • Charles Jessup – Business Facilitation Manager & Digital and Innovation (DSI), TechnipFMC
  • Jason Harms – Manager of Business Process Improvement, Cherwell Software
  • Marisa Brown – Senior Principal Research Lead, APQC

In your opinion, which new technology is having the biggest impact on process and performance professionals?

Dr. Mathias Kirchmer: I think the biggest impact is actually achieved through the appropriate combination of different technologies, aligned with the specific requirements of a business process. Technologies like robotic process automation (RPA) as part of intelligent automation, AI, optical character recognition (OCR), blockchain, or process analytics vary in importance depending on the context.

Charles Jessup: Customer journey mapping and service blueprinting have the biggest impact. Historically, process mapping has been internally focused, but these two technologies invert that focus and use the power of process mapping to improve a customer’s experience with your company, which aligns with the customer-centric priorities of most organizations.

Jason Harms: Early in my career, most applications were connected to data through an internal network, a direct internet connection, or on the desktop itself. Depending on the system or network in use, this limited when a task could be performed. As applications move to the cloud, we can imagine more possibilities, with fewer dependencies on network constraints and increased availability of “real-time” data, no matter the source.

Marisa Brown: Cloud computing. It’s the most widely adopted, and it helps boost processing power and decision making through easy access to datasets. Traditionally, organizations in many industries have struggled with disparate data and few means of accessing information silos simultaneously to generate insights. Using cloud computing, organizations can connect all datasets from disparate functions such as finance, supply chain, operations, HR, and sales.

Where do organizations struggle when adopting new technologies?

Dr. Mathias Kirchmer: The biggest issue is misalignment between technologies and the related people and process requirements. Without a clear understanding of process improvement opportunities, selecting appropriate technologies and implementing them in a value-driven way is almost impossible. Once this is resolved, efficient rollout is key – avoiding unnecessary process variants (e.g., by leveraging company-specific process reference models).

Charles Jessup: In dropping the laggards. For example, we still use fax machines even though email is superior. Why? Because some organizations and governments require information via fax. If we want to interact with those organizations, we can’t completely get rid of laggards like the fax machine.

Jason Harms: Understanding “how” people interact with new technology can be difficult to prepare for. People learn at different speeds, learn best through their method of choice, and will typically interact with systems through the path of least resistance to complete their tasks. By placing great emphasis on documenting operational processes, practicing change management, and conducting post-implementation reviews, organizations can discover and address the human interactions they did not predict when planning for new technologies.

Marisa Brown: Budgetary constraints, with costs seen as outweighing the benefits, along with lack of consensus and resistance to change.

Process automation is sometimes seen as a “cure-all” for broken processes. How can organizations make sure automation doesn’t just perpetuate bad processes?

Dr. Mathias Kirchmer: In most cases, simplification and, where appropriate, optimization of processes is a precondition for successful, intelligent automation. Segmenting processes into the 10-15 percent that are high impact and the remaining 85-90 percent that are commodity processes helps make this approach efficient. Commodity processes are, in most cases, improved by applying common practices, for example by leveraging industry or software reference models. This allows a focus on the high-impact processes, where a more detailed analysis and innovative design really pay off. Hence, best practice is to prioritize and segment, then simplify and optimize pragmatically. After this, it is important to establish a process management discipline that sustains the achieved benefits.

Charles Jessup: Do a proper value stream mapping of the process before you automate it. That way you know where you might be baking in waste. Reduce the waste first, then automate.

Jason Harms: With automation, existing process errors become repeatable and spread to a larger audience. Organizations should treat automation as a tool, not a solution, to help achieve their goals. I enjoy the agility of process management and how it can be used in any organization, unbound by industry or technology. Organizations of any size, type, and process maturity can avoid many pitfalls of deploying automation by effectively documenting and managing their processes.

Marisa Brown: First define the processes, then ensure they are consistent and repeatable. Sometimes, though, automation may require changes to processes, so it’s important to map the processes pre- and post-automation.