Logic and plant engineering: become a more effective problem-solver by avoiding these common fallacies
Troubleshooting problems in plant equipment and systems requires a flawlessly logical thought process.
Like many of the skills needed in the world of plant operations and management, logic is something maintenance managers and facilities engineers are just assumed to have. Yet good logic is a skill that rarely comes naturally.
Lack of training in critical thinking leaves us all the more prone to errors in logic, which are known as fallacies. Only by learning common fallacies and avoiding them in our day-to-day thought processes can we develop the critical thinking skills we need to be effective problem-solvers.
Post hoc, ergo propter hoc. One of the most common fallacies in the plant environment is post hoc, ergo propter hoc, or "after this, therefore because of this." This fallacy assumes that because one event occurs just prior to another, the first event caused the second. In one recent case, an operations group asked for help with a recurring disruption in the flow of one ingredient in a process. The group felt the problem was caused by a rise in the viscosity of another ingredient.
The reasoning went, "Every time the viscosity of this ingredient rises, the other ingredient flow stops!" This diagnosis was doubtful, however, because the delivery systems for the two ingredients were wholly independent. Further investigation revealed the ingredient flow disruption was caused by a blockage elsewhere in the system. Although logic indicated the viscosity of one material had no impact on the flow of the other, the sequence of events led the group to the fallacious conclusion that it did. The key to avoiding this fallacy is the old adage, "Don't jump to conclusions."
Insufficient statistics/hasty generalization. The same adage applies to avoiding the fallacy of insufficient statistics (also called hasty generalization). This fallacy results from the attempt to draw definitive conclusions on the basis of an inadequate number of data points. The examples are innumerable: assuming a single instance of production trouble indicates a system problem; scheduling machine rebuilds at a certain interval because the first breakdown occurred at that interval; condemning a whole batch of parts because one is bad.
Sample size is a key indicator of the accuracy of statistical information. Generalizations should not be made unless they are based on a representative sampling of the population. Yet crucial decisions are made daily based on samples that clearly are not representative. Statistical process control and predictive maintenance methods are techniques that help avoid hasty generalizations. However, keeping this fallacy in mind is helpful in developing good critical thinking skills in general.
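The effect of sample size on the reliability of a generalization is easy to demonstrate. The following sketch (an illustrative Python fragment, not from the article; the 5% defect rate is an assumed value chosen for the example) shows how a handful of inspected parts can suggest a defect rate wildly different from the true one, while a large sample settles near it:

```python
import random

random.seed(42)
TRUE_DEFECT_RATE = 0.05  # assumed "true" defect rate of the whole batch

def observed_rate(sample_size):
    # Inspect a random sample of parts and report the defect rate it suggests.
    defects = sum(random.random() < TRUE_DEFECT_RATE for _ in range(sample_size))
    return defects / sample_size

# A sample of 5 parts can easily show 0% or 20% defective;
# a sample of 1,000 will land close to the true 5%.
for n in (5, 20, 1000):
    print(f"sample of {n}: observed defect rate {observed_rate(n):.1%}")
```

Condemning the batch after inspecting five parts is exactly the hasty generalization the adage warns against; the large-sample estimate is what a representative sampling buys you.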
Affirming the consequent. Another common fallacy is known as affirming the consequent. Simply put, it is logic turned on its head, as this example demonstrates. Unless they are lubricated properly, most machines eventually break down. The backwards logic of this fallacy would reason: "If the machine isn't lubricated, it will break down. It broke down; therefore, it was not lubricated."
This conclusion may well be true. But it also may be false. There are an endless number of other reasons why the machine could have broken down. The proper chain of reasoning avoids confusing cause with effect: "If the machine isn't lubricated, it will break down. It wasn't lubricated; therefore, it broke down." Logical progressions should be viewed with care. Make certain your reasoning follows the proper logical order.
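The difference between the valid and fallacious forms can be checked mechanically. This short Python sketch (an illustration added here, not part of the original article) enumerates every truth assignment consistent with the rule "if the machine isn't lubricated, it breaks down" and shows that observing a breakdown leaves both lubrication states possible:

```python
from itertools import product

def implies(p, q):
    # Material implication: "p -> q" is false only when p is true and q is false.
    return (not p) or q

# Keep every combination of (not lubricated, broke down) that is
# consistent with the rule "not lubricated -> breakdown".
consistent = [
    (not_lubed, broke)
    for not_lubed, broke in product([True, False], repeat=2)
    if implies(not_lubed, broke)
]

# Affirming the consequent asks: given that the machine broke down,
# must it have been unlubricated? Both answers survive the rule.
causes_of_breakdown = {not_lubed for not_lubed, broke in consistent if broke}
print(causes_of_breakdown)  # {True, False}: the breakdown alone proves nothing
```

By contrast, filtering on "it wasn't lubricated" (the valid modus ponens direction) leaves only the breakdown outcome, which is why the proper chain of reasoning holds and the reversed one does not.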
Argumentum ad ignorantiam. Other fallacies occur when proper conclusions drawn from troubleshooting logic are debated with others. Argumentum ad ignorantiam (literally, "argument from ignorance") is also popularly known as being asked to prove a negative -- or stated another way, to prove something isn't the case.
For example, a maintenance manager is asked to replace an expensive part on a malfunctioning machine because the problem can't be found and that part just might be the cause. Nobody can prove it isn't the cause, so let's replace it and see if that fixes the problem. There are times when this course of action might be the most reasonable. However, if the only reason supporting the action is failure to prove the negative, it isn't a convincing argument. It often wins because "I don't know" is a difficult answer to give. But if your budget is going to take the impact of the decision, remember that line of reasoning is not valid.
Straw man. Another debate fallacy is known as straw man. In this simple but effective fallacy, your opponent misrepresents your position by misstating or exaggerating it. This sets up a completely different debate that your opponent can win.
For example, on the basis of quantitative evidence, you suggest a planned maintenance shutdown be postponed. A coworker who disagrees might ask, "So you think these machines can run forever and never be maintained?" Of course, this is not what is being suggested, but if you allow such an assertion to stand, you may find you've lost the debate. Let those involved know your true position -- and keep in mind your critical thinking skills.
Argumentum ad hominem. A last debate fallacy is one a professional should never use and never tolerate. The argumentum ad hominem (literally, "argument to the man") attacks an opponent's person instead of the argument. This fallacy takes several forms, including abusive personal criticism. It can easily become two-sided, with the responder falling prey to the tu quoque, or "you, too!" fallacy. A professional discussion that descends to this level should be a signal that the conversation is at an end, at least for now.
Assuming a technical professional is an inherently good logician is not completely wrong. Technical education by its nature conveys a good sense of logic. However, adept critical thinking requires deeper knowledge. Keeping these common fallacies in mind helps not only in an immediate troubleshooting process, but also in developing better logic skills in general.
Henry Ford said, "Thinking is the hardest work there is. That's why so few people engage in it." Good, critical thinking is even harder.
Jim Vinoski holds a B.S. in mechanical engineering from Christian Brothers University, Memphis, TN. Direct questions about the article to him at 616-832-6227 or by e-mail at firstname.lastname@example.org.