Why Our Corrective Actions Fail

by Kevin McManus, Chief Excellence Officer, Great Systems

Tens of thousands of corrective and preventive actions have crossed my eyes and passed through my ears over the past forty-plus years. Over time, I began to see a definite pattern. It wasn’t the obvious ‘weak fix’ pattern many people see at work. You know the favorite fixes: retraining, expanded procedures, and punishment-focused responses. Instead, it was a less visible pattern, a pattern of psychology that helped explain why our corrective actions fail.

What is the Psychology Behind Our Failing Fixes?

Here is what I saw in a nutshell. When a person writes a corrective action to directly address a human error, they tend to recommend a relatively weak fix. For example, suppose we want to write a corrective action to address a problem where people don’t wear the right work gloves. What do we often recommend? Most responses focus on glove availability and rule reminders. How effective are those fixes? What is the probability that the unwanted behavior happens again?

If we instead use a work system gap or weakness as our initial corrective action reference point, we are more likely to recommend a much more systematic fix. If we hand out gloves as part of job preparation each day, does this increase the chances that people will wear them? When supervisors routinely measure how consistently personal protective equipment is used, does this increase the likelihood that gloves will be worn? How does the daily work system reinforce effective glove use?

Such changes may sound simple. Before you toss my observation aside, take a look at the last fifty or so corrective actions you have written or reviewed. What patterns exist? What are your favorite fixes? How relatively strong, or weak, are they? Where do your fixes tend to fall on the hierarchy of controls? What percent of the time do you focus on trying to change people, instead of trying to change systems?
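If it helps to make that self-audit concrete, here is a minimal sketch, in Python with made-up example data, of the kind of tally I have in mind: tag each fix with the hierarchy of controls level it relies on and with whether it tries to change people or the work system, then look at the percentages. The category names and sample records below are illustrative assumptions, not a prescribed taxonomy or an actual data set.

    from collections import Counter

    # Illustrative sample records: each corrective action is tagged with the
    # hierarchy-of-controls level it relies on and with whether it tries to
    # change people or the work system. The tags and examples are made up
    # for this sketch.
    corrective_actions = [
        {"fix": "retrain operator on the glove rule",          "control": "administrative", "changes": "people"},
        {"fix": "post reminder signs at the workstation",      "control": "administrative", "changes": "people"},
        {"fix": "hand out gloves during daily job preparation", "control": "ppe",            "changes": "system"},
        {"fix": "add machine guarding so gloves are not needed", "control": "engineering",   "changes": "system"},
        {"fix": "substitute a less hazardous cleaning agent",   "control": "substitution",   "changes": "system"},
    ]

    # Where do the fixes fall on the hierarchy of controls?
    by_control = Counter(action["control"] for action in corrective_actions)

    # What percent of the fixes try to change people instead of systems?
    people_focused = sum(1 for action in corrective_actions if action["changes"] == "people")
    people_pct = 100 * people_focused / len(corrective_actions)

    print("Fixes by hierarchy-of-controls level:", dict(by_control))
    print(f"People-focused fixes: {people_pct:.0f}% of the total")

Run against your own last fifty corrective actions, a tally like this makes the pattern hard to ignore: the higher the share of people-focused, administrative fixes, the weaker your overall response tends to be.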

Why Our Corrective Actions Fail

Too many organizations rely on failing fixes such as reminders, discipline, and retraining. What is the case in your company? Our overreliance on weak fixes is often a result of the root cause analysis approach we use. Traditional approaches such as the 5 Whys technique or fishbone analysis allow human error to be viewed as a root cause. My experience has taught me that this is not a best practice to follow. Allowing human error to be viewed as a root cause is a process error in its own right. The better option is to use a root cause analysis approach that looks for the systemic reasons behind persistent process errors.

All people make mistakes. If we want to produce mistake-free work, we have to design our work systems so they discourage, rather than encourage, human error. Our space programs and nuclear power generating companies get this, for example. They rely heavily on safeguards such as well-designed checklists and effective job preparation to help people do their jobs well. They don’t rely primarily, if not solely, on memory as a mistake prevention tool. What percentage of the time do you count on memory to help minimize errors? Is it possible that the use of more effective safeguards could significantly improve performance?

Don’t Accept Human Error as a Root Cause

For years, I made the mistake of seeing human error as a root cause. Then I started teaching the TapRooT® root cause analysis approach as a contract trainer. The design of this approach forces the user to look for the systemic causes of human error. In other words, human error is rarely, if ever, a root cause with this process. What percentage of your root causes are human errors? Is it possible that a different root cause analysis approach could lead you to better, work system focused fixes?

If we continue to write corrective and preventive actions that address human error directly, we will continue to write relatively weak fixes. It’s why our corrective actions fail. The first fix to make in this case is to reject human error as a root cause. Instead, always search for the systemic reasons people do things they really don’t intend to do. How often do your fixes fail? Is it possible that a root cause analysis process shift, along with a psychological shift, could lead you toward a more mistake-free workplace?

Keep improving!

Kevin McManus, Chief Excellence Officer, Great Systems

www.greatsystems.com            kevin@greatsystems.com

FOLLOW me on Twitter: @greatsystems

LIKE Great Systems on Facebook

CONNECT with me on LinkedIn

CHECK OUT my Amazon.com Author Page

NOTE: If you found value in this article, you might also benefit from reading my new book, “Error Proof – How to Stop Daily Goofs for Good”, which is now for sale on Amazon.com.