Ethics and Critical Thinking
The Logical Fallacies That Get People Killed
Every Marine who has ever sat through a safety brief has encountered the implicit argument that information prevents bad outcomes.
Know the rules. Know the risks. Know the standard. And the knowledge will produce the right behavior.
It will not. Not reliably. Not under pressure. Not when the cognitive load of the operational environment is competing with the cognitive load of the ethical decision. Because the gap between knowing and doing is not only a values gap. It is a thinking gap. And the thinking gap is exploited by a specific set of cognitive errors that operate below the level of conscious awareness and produce catastrophic decisions in people who genuinely know better.
Those errors have names. And naming them is the first step toward preventing them from determining outcomes.
What critical thinking actually is
Critical thinking is not skepticism. It is not contrarianism. It is not the refusal to accept authority.
Critical thinking is the disciplined application of intellectual standards to the process of reasoning. It requires clarity about what is being claimed. Evidence for why the claim is true. Sound logic connecting the evidence to the conclusion. And the intellectual humility to recognize when the reasoning is weak regardless of how comfortable the conclusion feels.
For a Marine, critical thinking is an operational requirement because every consequential decision in the operational environment involves incomplete information, time pressure, emotional loading, and the constant possibility that the situation is not what it appears to be. The Marine who cannot think critically about their own reasoning is not just philosophically deficient. They are operationally dangerous.
The moral lens that ethics requires, the capacity to evaluate situations through values and determine what action those values demand, is a form of critical thinking. It cannot function reliably if the underlying reasoning process is being distorted by logical fallacies and cognitive biases that the Marine does not know how to identify.
The fallacies most likely to produce ethical failure
Logical fallacies are errors in reasoning. They are not lies. They are not deliberate deceptions. They are structural flaws in the thinking process that produce unsupported conclusions from premises that may themselves be true. Every human being is susceptible to them. The ones most likely to produce ethical failure in military contexts are worth examining specifically.
The first is the appeal to authority. The conclusion must be right because someone in authority said it. The problem is not that authority is irrelevant. In military organizations authority carries genuine weight for good reasons. The problem is that authority does not determine truth. An order from a senior can be wrong. A policy from a headquarters can be unlawful. A cultural norm from a respected peer group can be deeply unethical. The Marine who accepts conclusions on the basis of authority alone, without engaging the reasoning behind them, is not exercising critical thinking. They are outsourcing their moral responsibility to whoever holds the highest rank in the room.
The second is the appeal to popularity. The conclusion must be right because everyone agrees with it. Everyone in the unit has normalized the conduct. Everyone in the platoon has accepted the standard as the real standard regardless of what the written standard says. The moral courage required to push back against what everyone in the unit accepts as normal is among the rarest and most important forms of courage in military service. The appeal to popularity makes that courage harder by creating the impression that dissent from the group norm is the deviant position rather than the ethical one.
The third is the false dilemma. Presenting only two options when more exist. Either we accomplish the mission or we follow the rules. Either we protect our Marines or we protect the civilians. Either we report the misconduct or we protect our fellow Marine. In almost every real ethical situation more options exist than the false dilemma presents. The Marine who is trained to recognize this fallacy is more likely to find the third option, the one that honors more of the competing values rather than requiring the complete abandonment of one.
The fourth is rationalization, which functions as a collection of fallacies deployed in service of a conclusion that has already been reached for non-rational reasons. The Marine who has already decided to do something they know is wrong will construct reasoning that makes it look like the conclusion follows from the values rather than contradicting them. Rationalization is particularly dangerous because it is the fallacy that most closely resembles genuine ethical reasoning. It uses the same language. It invokes the same values. It arrives at a conclusion that was predetermined by desire or fear rather than reason.
The fifth is the slippery slope. The argument that a small deviation from the standard now will inevitably lead to catastrophic deviation later and therefore the standard must be held absolutely with no exceptions. This fallacy cuts in both directions. It is used to prevent genuine exceptions that a reasonable values-based analysis would permit. And it is used conversely to justify small deviations on the grounds that since the catastrophic outcome is not immediate the deviation is acceptable. The moral culmination mechanism is in part the accumulation of small rationalizations that individually seem manageable and collectively produce catastrophic ethical failure.
The role of emotion in ethical reasoning
Critical thinking does not require the elimination of emotion from ethical decision making. It requires the disciplined management of emotion so that it informs rather than overrides the reasoning process.
Emotions carry genuine moral information. The feeling of disgust at witnessing something wrong is a signal worth attending to. The feeling of fear in the face of genuine danger is a survival mechanism worth respecting. The feeling of compassion for a suffering person is a moral prompt worth acting on.
The problem arises when emotion substitutes for reasoning rather than informing it. The Marine who acts on the rage produced by the loss of a brother, without subjecting that rage to values-based analysis, is not responding to moral information. They are being driven by an emotion that, as Jonathan Shay documented, can destroy the capacity for virtue when it crosses into the berserk state that the moral culmination essay described.
Aristotle’s insight is relevant here. The virtuous person does not eliminate emotion from moral decision making. They develop the habit of reasoning through emotion rather than being driven by it. That habit is not natural. It is developed. It requires the same deliberate practice that any other complex skill requires. And it requires the intellectual humility to recognize when emotion is distorting the reasoning process even when the emotion feels entirely justified.
Reflection as the primary tool
The antidote to both logical fallacies and unmanaged emotion in ethical decision making is reflection. Not passive introspection. Deliberate, structured examination of the reasoning process before action is taken.
Reflection asks the questions that fallacies prevent from being asked. What is the evidence for this conclusion? What other options exist beyond the two I am currently considering? Who or what is the authority behind this claim and is that authority reliable? Is this reasoning or rationalization? What emotion is operating right now and is it informing or distorting the analysis?
These questions cannot be asked effectively in the moment of crisis if they have never been practiced before the moment of crisis. Reflection is a skill. Like every skill it must be developed through deliberate practice under progressively more demanding conditions.
The leader who builds a culture of reflection, who consistently asks their Marines to examine their own reasoning before acting and after acting, is developing the critical thinking capacity that prevents logical fallacies from determining outcomes under pressure.
The leader who does not is leaving their Marines’ ethical decision making vulnerable to exactly the cognitive errors that have produced the most damaging institutional failures of the past five decades.
Humility as the precondition
Every one of the logical fallacies described in this essay is more powerful in the absence of intellectual humility. The Marine who is certain of their own reasoning cannot recognize when that reasoning is flawed. The Marine who is certain of their superior’s authority cannot recognize when that authority is being misused. The Marine who is certain of the group norm cannot recognize when the group is wrong.
Intellectual humility is not weakness. It is not the absence of conviction. It is the honest recognition that any individual’s reasoning process, including one’s own, is fallible. That fallibility does not disappear under pressure. It intensifies.
The Marine who approaches ethical decisions with genuine intellectual humility, who asks whether their reasoning is sound before acting rather than after, who genuinely considers the possibility that they are wrong, is not a less decisive Marine. They are a more reliable one. The confidence that follows genuine deliberation is more durable than the confidence that follows rationalization because it has been tested rather than assumed.
That is the standard critical thinking demands. It is more demanding than trusting instinct. It is more demanding than following authority. And it is more reliably productive of outcomes that honor the values that the institution and the individual both claim to hold.
References
Aristotle. Nicomachean Ethics. Translated by Terence Irwin. Hackett Publishing, Indianapolis, 1999.
Headquarters US Marine Corps. Warfighting. MCDP 1. Washington DC, 1997.
Katolin, Dennis W. Ethics in War: A Doctrine for the American Warfighter. Proposed Marine Corps Doctrinal Publication, 2016.
Shay, Jonathan. Achilles in Vietnam: Combat Trauma and the Undoing of Character. Touchstone, New York, 1994.
Zimbardo, Philip. The Lucifer Effect: Understanding How Good People Turn Evil. Random House, New York, 2007.