Logic and plant engineering: become a more effective problem-solver by avoiding these common fallacies

Troubleshooting problems in plant equipment and systems requires a flawlessly logical thought process.

By James M. Vinoski January 1, 1998

Troubleshooting problems in plant equipment and systems requires a flawlessly logical thought process. Like many of the skills needed in the world of plant operations and management, logic is something maintenance managers and facilities engineers are just assumed to have. Yet good logic is a skill that rarely comes naturally.

Lack of training in critical thinking leaves us all the more prone to errors in logic, which are known as fallacies. Only by learning common fallacies and avoiding them in our day-to-day thought processes can we develop the critical thinking skills we need to be effective problem-solvers.

Post hoc, ergo propter hoc. One of the most common fallacies in the plant environment is post hoc, ergo propter hoc, or "after this, therefore because of this." This fallacy assumes that because one event occurs just prior to another, the first event caused the second. In one recent case, an operations group asked for help with repeated disruptions in the flow of one ingredient in a process. The group felt the problem was caused by a rise in the viscosity of another ingredient.

The reasoning went, “Every time the viscosity of this ingredient rises, the other ingredient flow stops!” This diagnosis was doubtful, however, because the delivery systems for the two ingredients were wholly independent. Further investigation revealed the ingredient flow disruption was caused by a blockage elsewhere in the system. Although logic indicated the viscosity of one material had no impact on the flow of the other, the sequence of events led the group to the fallacious conclusion that it did. The key to avoiding this fallacy is the old adage, “Don’t jump to conclusions.”

Insufficient statistics/hasty generalization. The same adage applies to avoiding the fallacy of insufficient statistics (also called hasty generalization). This fallacy results from the attempt to draw definitive conclusions on the basis of an inadequate number of data points. The examples are innumerable: assuming a single instance of production trouble indicates a system problem; scheduling machine rebuilds at a certain interval because the first breakdown occurred at that interval; condemning a whole batch of parts because one is bad.

Sample size is a key indicator of the accuracy of statistical information. Generalizations should not be made unless they are based on a representative sampling of the population. Yet crucial decisions are made daily based on samples that clearly are not representative. Statistical process control and predictive maintenance methods are techniques that help avoid hasty generalizations. However, keeping this fallacy in mind is helpful in developing good critical thinking skills in general.
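To put a rough number on it (an illustration added here, using a standard statistical rule of thumb rather than figures from any particular case): the 95% margin of error for an estimated percentage is at most about 1 divided by the square root of the sample size.

   Sample of 10 parts: margin of error roughly ±31%
   Sample of 100 parts: roughly ±10%
   Sample of 1,000 parts: roughly ±3%

A defect rate judged from a handful of parts can therefore easily be off by enough to reverse the conclusion.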

Affirming the consequent. Another common fallacy is known as affirming the consequent. Simply put, it is logic turned on its head. Consider an example: unless they are lubricated properly, most machines eventually break down. The backwards logic of this fallacy would reason: "If the machine isn't lubricated, it will break down. It broke down; therefore, it was not lubricated."

This conclusion may well be true. But it also may be false. There are countless other reasons why the machine could have broken down. The proper chain of reasoning avoids confusing cause with effect: "If the machine isn't lubricated, it will break down. It wasn't lubricated; therefore, it broke down." Logical progressions should be viewed with care. Make certain your reasoning follows the proper logical order.
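In the shorthand of propositional logic (notation added here for illustration; it is not part of the original example), the two patterns are:

   Valid (modus ponens):               If P, then Q. P is true. Therefore Q.
   Invalid (affirming the consequent): If P, then Q. Q is true. Therefore P.

where P stands for "the machine is not lubricated" and Q for "the machine breaks down." The second pattern fails because Q can be true for reasons having nothing to do with P.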

Argumentum ad ignorantiam. Other fallacies occur when proper conclusions drawn from troubleshooting logic are debated with others. Argumentum ad ignorantiam (literally, "argument from ignorance") is popularly known as being asked to prove a negative, or, stated another way, to prove that something isn't the case.

For example, a maintenance manager is asked to replace an expensive part on a malfunctioning machine because the problem can't be found and that part just might be the cause. Nobody can prove it isn't the cause, so let's replace it and see if that fixes the problem. There are times when this course of action might be the most reasonable. However, if the only support for the action is the failure to prove the negative, it isn't a convincing argument. It often wins because "I don't know" is a difficult answer to give. But if your budget is going to absorb the cost of the decision, remember that this line of reasoning is not valid.

Straw man. Another debate fallacy is known as straw man. In this simple but effective fallacy, your opponent misrepresents your position by misstating it or exaggerating it. This sets up a completely different debate that your opponent can win.

For example, on the basis of quantitative evidence, you suggest a planned maintenance shutdown be postponed. A coworker who disagrees might ask, “So you think these machines can run forever and never be maintained?” Of course, this is not what is being suggested, but if you allow such an assertion to stand, you may find you’ve lost the debate. Let those involved know your true position — and keep in mind your critical thinking skills.

Argumentum ad hominem. A last debate fallacy is one a professional should never use and never tolerate. The argumentum ad hominem (literally, "argument to the man") attacks an opponent's person instead of the argument. This fallacy takes several forms, including abusive personal criticism. It can easily become two-sided, with the responder falling prey to the tu quoque, or "you, too!" fallacy. A professional discussion that descends to this level should be a signal that the conversation is at an end, at least for now.

Assuming a technical professional is an inherently good logician is not completely wrong. Technical education by its nature conveys a good sense of logic. However, adept critical thinking requires deeper knowledge. Keeping these common fallacies in mind helps not only in an immediate troubleshooting process, but also in developing better logic skills in general.

Henry Ford said, “Thinking is the hardest work there is. That’s why so few people engage in it.” Good, critical thinking is even harder.

Jim Vinoski holds a B.S. in mechanical engineering from Christian Brothers University, Memphis, TN. Direct questions about the article to him at 616-832-6227 or by e-mail at vinoski@michweb.net.