Pseudoscience? by Kevin McManus
First published in Industrial Magazine March 2004
The dictionary defines pseudoscience as "a theory, methodology, or practice that is considered to be without scientific foundation." The term is more often directed at things like astrology or metaphysics, but taken literally, it could also encompass a majority of the management philosophies and tools that we have all grown to know, sometimes love, and most definitely invest lots of money in. Perhaps that is why the phrase "management science" came into being.
In the world of business, we constantly fight the battle between making decisions based on opinion and using data to help us make better decisions (is this science?). Meanwhile, in the 'real world', we consistently see different groups using statistics, polls, and other forms of so-called analysis to provide 'facts' that support or disprove theories. In many cases, we go to extremes to use data to prove the obvious while failing to use data to clarify genuinely murky issues. The contradiction is enough to make one wonder, "How much science do we need to employ as we make decisions?"
It was Dr. Deming who said, "No theory, no learning." He also gave us the quote, "The most important things are both unknown and unknowable." Taken together, these two statements call into question the place pseudoscience has in the world of business. Do we have to prove or disprove all theories before they can be accepted and applied? How much proof do we need before testing a given management approach? I hope we don't have to wait for proof, because some of the theories about system relationships that I consider most important in a high performance workplace would also be quite difficult to actually prove.
For example, it is my belief that as organizational size increases, the percentage of effective leaders in a given organization decreases. In order to prove the validity and strength of this correlation, we would have to assemble a collection of companies of various sizes, measure leadership effectiveness in those companies for a given period of time, and then analyze the results to draw our conclusions. While this type of effort might make for a great doctoral thesis, it misses the point. By simply asking the question, making stated assumptions, and having true dialogue about our theories, we can actually find ways to make our leaders better without doing all of the research.
Similarly, I theorize that company performance increases as the percentage of time employees spend on the job with external customers increases. I have witnessed this effect in real life in more than one organization, but I have no hard data to prove the theory. Does that mean that we should not make an effort to increase the average amount of time our people spend with our customers? Do we need to have proof that the theory is valid before we initiate strategies to take advantage of the logic it is based on?
In the educational arena, people earn their livings proving and disproving theories. In many cases, the research and its findings are never used, or even known about, in the 'real' world, even though someone devoted a great deal of personal time and energy to the work. In the business world, we more often go with opinion because we don't have the time, money, or inclination to do the detailed analysis needed to validate a theory that seems like common sense. Personally, I believe there is a middle ground: hard data is necessary at times, but it is not a mandatory factor for theory acceptance and application.
I also believe that organizational performance improves along several dimensions as the level of employee participation in activities away from their daily jobs increases. I adopted this belief after working for five years in a highly participative company and seeing a variety of best practice companies work in this manner through the Baldrige National Quality Award. While I have peers who are just as passionate about high levels of employee involvement as I am, I also know there are others who hold essentially the opposite set of involvement-based beliefs. Unfortunately, I have little hard data to make my case.
Most of us know that there is a point where continuing to increase throughput rates will begin to compromise product quality. We accept this theory to a degree, but we do not go the extra step to actually define where that point is and to manage our key processes at the point of optimum, as opposed to maximum, throughput. Our 'pseudoscience' theories carry enough weight to make us cautious as we push for higher levels of output from our people, but they are not compelling enough to rein in our desire to be the fastest, biggest, or best. Do we actually have to do the research in order to know when we are beginning to 'kill the goose', or can we come to our collective senses before we reach that point?
It is more important to be aware of when we are making decisions based on opinion rather than fact than it is to always insist on facts. In The Fifth Discipline, Peter Senge stated that we need a thought revolution in management if we are to move to sustainable levels of higher performance. For this reason, dialogue in the form of team learning is considered one of the five key disciplines. Practicing dialogue helps us recognize the assumptions we are making, learn from those assumptions, and reach higher levels of collective understanding. Dialogue can bridge the gap between fact and opinion if we make a concerted effort to learn and practice the skill.
I personally do not know where science begins and ends in the business world. I do know that gut instinct plays as much of a role in organizational success as fact-based analysis does. The challenge lies in learning to define as a leadership or process team when we need to use facts and what degree of analysis is needed. We need to recognize that there are times when management pseudoscience is just as effective as management science would be. Finding the balance is much more important than finding the answer.
Would You Like to Learn More?
Great Systems! can help you design and improve your key work systems in three ways – system assessment, one day system design workshops, and ongoing system evaluation and improvement coaching. If you are interested in learning more about these services, please send Kevin McManus an e-mail at email@example.com or give him a call at 206.226.8913. Keep improving!
“The only thing I know is that I do not know it all.” -- Socrates