The Psychology of Failing Fixes

by Kevin McManus, Chief Excellence Officer, Great Systems

After tens of thousands of corrective and preventive actions have crossed my desk over the past thirty-plus years, I began to see a definite pattern. It wasn’t the obvious ‘weak fix’ pattern most people notice in their organizations – the familiar favorites of retraining, expanding procedures, and punishment-focused fixes. Instead, it was a less visible pattern: a psychology behind the failing fixes I was seeing over and over again.

What is the Psychology Behind Our Failing Fixes?

Here is what I saw in a nutshell. When a person attempts to write a corrective action that directly addresses a human error, they tend to gravitate towards recommending a relatively weak fix. For example, if we want to write a corrective action to address the problem of people not wearing the right work gloves, what do we often recommend? Most responses focus on making sure gloves are available and reminding the employee of the requirement to wear them. How effective are those fixes? What is the probability that the same behavior will happen again?

If we instead use a work system gap or weakness as our initial corrective action reference point, we are far more likely to recommend a systemic fix. If gloves are handed out as part of the job briefing each day, does this increase the chances that people will wear them? If supervisors are routinely measured on how consistently they support personal protective equipment use, does this increase the likelihood that gloves will be worn? How does the daily work system reinforce effective glove use?

Such changes may sound simple. Before you toss my observation aside, look collectively at the last fifty or so corrective actions you have written or reviewed. What patterns exist? What are your favorite fixes? How strong, or weak, are they? Where do your fixes tend to fall on the hierarchy of controls? What percentage of the time have you focused on trying to change people instead of trying to change systems?
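If it helps to make that review concrete, here is a minimal sketch in Python of one way to tally a batch of corrective actions against the hierarchy of controls and see what share of your fixes target people rather than systems. The category labels and the sample data are hypothetical, chosen only for illustration; substitute whatever tags your own tracking system uses.

```python
from collections import Counter

# Illustrative, people-focused (weak) fix categories; labels are hypothetical.
WEAK_PEOPLE_FOCUSED = {"retraining", "reminder", "discipline", "procedure update"}

# Hypothetical sample: each corrective action tagged with the kind of fix it relies on.
corrective_actions = [
    "retraining", "reminder", "engineering control", "reminder",
    "discipline", "procedure update", "retraining", "substitution",
]

counts = Counter(corrective_actions)
total = len(corrective_actions)
weak = sum(n for fix, n in counts.items() if fix in WEAK_PEOPLE_FOCUSED)

print("Favorite fixes:")
for fix, n in counts.most_common():
    print(f"  {fix}: {n} ({n / total:.0%})")

print(f"People-focused (weak) fixes: {weak / total:.0%} of the total")
```

Even a simple tally like this makes your favorite fixes, and how often they aim at people instead of systems, visible at a glance.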

How Often Do You Rely on Failing Fixes?

Too many organizations rely on failing fixes such as reminders, discipline, and retraining. What is the case in your company? Our over-reliance on weak fixes is often a result of the root cause analysis approach we use. Traditional approaches such as the 5 Whys technique or fishbone analysis allow human error to be viewed as a root cause. My experience has taught me that this is not a best practice to follow. Allowing human error to be viewed as a root cause is a process error in its own right. The better option is to use a root cause analysis approach that looks for the systemic reasons behind persistent process errors.

All people make mistakes. If we want to produce error-free work, we have to design our work systems so they discourage, rather than encourage, human error. Our space programs and nuclear power generating companies get this, for example. They rely heavily on safeguards such as well-designed checklists and effective job preparation to guide people in doing their jobs well. They don’t rely primarily, if not solely, on memory as an error prevention tool. What percentage of the time do you count on memory to help minimize errors? Is it possible that the use of more effective safeguards could significantly improve performance?

Human Error Should Not Be Accepted as a Root Cause

For years, I made the mistake of seeing human error as a root cause, until I started teaching the TapRooT® root cause analysis approach as a contract trainer. This approach is designed to force the user to look for the systemic causes of human error. In other words, human error is rarely, if ever, a root cause with this process. What percentage of your root causes are human errors? Is it possible that a different root cause analysis approach could lead you to better, work system-focused fixes?

If we continue to try to write corrective and preventive actions that address human error directly, we will continue to write relatively weak fixes. That is the psychology of failing fixes. The first fix to make is to reject human error as a root cause. Instead, always search for the systemic reasons people do things they really don’t intend to do. How often do your fixes fail? Is it possible that a root cause analysis process shift, along with a psychological shift, could lead you towards a more error-free workplace?

Keep improving!

Kevin McManus, Chief Excellence Officer, Great Systems

www.greatsystems.com | kevin@greatsystems.com

FOLLOW me on Twitter: @greatsystems

LIKE Great Systems on Facebook

CONNECT with me on LinkedIn


NOTE: If you found value in this article, you might also benefit from reading my new book, “Error Proof – How to Stop Daily Goofs for Good,” which is now for sale on Amazon.com.


How to Prevent Failing Fixes Workshop Handout Part 1

How to Prevent Failing Fixes Workshop Handout Part 2

How to Prevent Failing Fixes Workshop Handout Part 3

How to Prevent Failing Fixes Workshop Handout Part 4


About the Author:

Kevin McManus serves as Chief Excellence Officer for Great Systems! and as an international trainer for the TapRooT® root cause analysis process. During his 38-plus years in the business world, he has served as an Industrial Engineer, Training Manager, Production Manager, Plant Manager, and Director of Quality. He holds an undergraduate degree in Industrial Engineering and an MBA. He has served as an Examiner and Senior Examiner for the Malcolm Baldrige National Performance Excellence Award for eighteen years. Kevin also writes the monthly performance improvement column for Industrial and Systems Engineering magazine, and he has published a new book entitled “Vital Signs, Scorecards, and Goals – the Power of Meaningful Measurement.”