Empowering Minds
Critical Thinking and Skepticism, the Foundation of Individual Rights
In an age saturated with information, the ability to think clearly and critically is more vital than ever. Critical thinking is the objective analysis and evaluation of an issue in order to reach a reasoned judgment. Conclusions must logically follow from premises that have themselves been verified.
Critical thinking does not exist in a vacuum but is intertwined with skepticism (a questioning attitude toward knowledge claims) and empiricism (the idea that knowledge originates primarily from sensory experience). These approaches are not merely intellectual exercises; they have historically served to advance and protect individual rights, counterbalance superstition, and resist appeals to authority.
In the United States, our education system and culture are failing to teach the critical thinking skills, and even the appreciation of critical thinking, that people need to be informed, responsible citizens of a democracy.
The historical impact of challenging established beliefs is evident in the Enlightenment, a period marked by an emphasis on reason and individual thought, which saw thinkers championing the power of human reason over traditional authority. Immanuel Kant's call to "sapere aude" ('dare to know') encouraged individuals to use their understanding without the guidance of others. This undermined the authority of those who claimed exclusive access to truth, such as monarchs and religious officials. Pierre Bayle, a forerunner of the Enlightenment, exemplified this skeptical spirit by fearlessly questioning religious, metaphysical, and scientific dogmas in his Historical and Critical Dictionary. His work exerted a radical, liberating influence, encouraging the examination of underlying assumptions. This intellectual ferment, fueled by doubt and skepticism toward existing power structures, paved the way for articulating individual liberty, representative government, the rule of law, and religious freedom as central doctrines.
The scientific revolution, which preceded and influenced the Enlightenment, provided a model for rational inquiry based on observation and empirical evidence. Figures like Newton, Bacon, and Locke, whom Jefferson considered the "three greatest men that ever lived," championed approaches to knowledge grounded in experience rather than unquestioned authority. This shift toward empiricism meant that claims of knowledge, whether about nature or society, were subjected to increasing scrutiny and higher standards of evidence. According to the Oxford English Dictionary, empiricism is "Primary reliance on evidence derived from observation, investigation, or experiment rather than on abstract reasoning, theoretical analysis, or speculation" (OED, n.d.). The replacement of classical medical practices with modern ones grounded in empirical evidence demonstrates how a skeptical, evidence-based approach leads to more effective practice.
Here are some tips to guide you toward being a more effective critical thinker:
- Question Assumptions: Just as Bayle questioned established dogmas, make it a habit to identify and scrutinize the assumptions that underpin your own beliefs and the claims you encounter. Asking "How do I know this is true?" about new information helps avoid accepting claims at face value and builds the ability to evaluate information critically.
- Seek Different Perspectives: Before forming an opinion, actively try to understand multiple viewpoints on an issue. Ask yourself, "What might someone who disagrees with this think?" This expands your understanding and helps you see the strengths and limitations of different arguments.
- Explain Your Reasoning: Challenge yourself to explain your reasoning step-by-step. This helps you identify gaps in your thinking and strengthens your ability to build logical arguments.
- Utilize the "Baloney Detection Kit" described by Carl Sagan in his book The Demon-Haunted World: Science as a Candle in the Dark (1995), a set of practical tools for skeptical thinking. These include demanding independent confirmation of facts, encouraging substantive debate among experts with different viewpoints, rejecting arguments from authority that lack independent evidence, generating multiple hypotheses, and devising systematic tests to disprove alternative explanations.
- Engage "System 2" Thinking: The psychologist Daniel Kahneman described two modes of human thought, based on his research, in his book Thinking, Fast and Slow (2011). He refers to these as System 1 and System 2. System 1 is fast, intuitive, and automatic; it reaches conclusions quickly by using heuristics and mental shortcuts, which makes it prone to biases but well suited to quick survival decisions. System 2 is slower, more deliberate, and more analytical; it is the mode we use for complex reasoning and methodical, logical analysis, and it demands more energy. Because most people default to System 1 to conserve mental effort, it often "hijacks" the decision-making process, and engaging System 2 requires deliberate effort. Critical thinkers must learn to recognize when intuitive responses dominate and consciously engage in a deliberate, analytical reasoning process.
Effective critical thinking also involves recognizing and avoiding logical fallacies and flaws in reasoning that weaken or invalidate an argument. For example, the ad hominem fallacy attacks the person making the argument instead of the argument itself. Recognizing this fallacy allows you to focus on the substance of the claim rather than on irrelevant personal attacks. Similarly, the argument from authority claims something is true simply because an expert or person in power says so without providing independent evidence. While expert opinions can be valuable, critical thinkers understand the importance of independent confirmation.
Being aware of rhetorical techniques is also crucial for navigating persuasive messages. These techniques are used to influence beliefs and behaviors and are often used manipulatively. Propaganda, for instance, uses biased or misleading information to promote a particular viewpoint. Understanding how propaganda works allows you to critically evaluate the information presented and resist manipulation. Another common technique is the use of loaded language, which employs emotionally charged words to evoke a specific feeling or opinion, often bypassing rational analysis. Recognizing loaded language helps you to focus on the factual content of a message rather than being swayed by emotional appeals.
The historical struggle for individual rights is inextricably linked to the willingness of individuals to doubt established norms, to demand evidence, and to think for themselves. The Enlightenment’s challenge to absolute monarchy and the advocacy for individual rights were fueled by skeptical thinkers who questioned the divine right of kings and championed the idea of government by the consent of the governed, as articulated by John Locke in 1689 in his "Two Treatises of Government." The fight for religious tolerance, championed by figures like Voltaire, emerged from a context where individuals began to question the authority of established churches and demand the freedom to practice their own beliefs. These transformative shifts were not granted by benevolent authorities; they were won through the persistent application of reason, skepticism, and a commitment to evidence-based arguments.
Carl Sagan recommended several specific tools for his kit:
- Having an independent confirmation of the “facts.”
- Encouraging substantive debate.
- Discounting arguments from authority.
- Starting with more than a single hypothesis to consider.
- Avoiding emotional investment in a particular hypothesis.
- Quantifying values from objective measurements rather than subjective opinions.
- Realizing every link in a chain of argument must hold for the reasoning to be valid.
- Using Occam's razor (Duignan, Encyclopedia Britannica, 2025) by choosing the simpler explanation when two have the same explanatory power.
- And asking whether a hypothesis is falsifiable, meaning it has implications that could be disproved by reliable evidence or experiment.
Developing strong critical thinking skills, embracing a healthy dose of skepticism, and valuing empirical evidence are not just academic pursuits; they are essential tools for personal empowerment and the protection of individual rights. By learning to question assumptions, seek diverse perspectives, understand logical fallacies, and recognize rhetorical techniques, you equip yourself to navigate the complexities of the modern world and contribute to a more just and reasoned society. Just as the thinkers of the Enlightenment used these tools to challenge tyranny and advocate for freedom, we too can use them to uphold and expand the rights of all individuals. The journey of critical thinking is a continuous one, demanding ongoing effort and a commitment to the "difficult art of thought," but it is a journey that ultimately leads to greater understanding, autonomy, and a more informed and empowered citizenry.
I will close with a quote from Carl Sagan's The Demon-Haunted World that is painfully pertinent to our time and predicament:
We've arranged a global civilization in which the most crucial elements — transportation, communications, and all other industries; agriculture, medicine, education, entertainment, protecting the environment; and even the key democratic institution of voting — profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.
Appendix
This list is based on Carl Sagan’s book The Demon-Haunted World in the chapter named “The Fine Art of Baloney Detection,” which was expanded from an earlier article published by Sagan under the same title.
The full “kit” is available at https://centerforinquiry.org/learning-resources/carl-sagans-baloney-detection-kit/ courtesy of the Center for Inquiry.
Logical Fallacies
It is important to avoid using these and to recognize when others are resorting to them, intentionally or not.
- Ad Hominem Attacks: Attacking the character, motive, or personal traits of an individual making an argument instead of addressing the substance of the argument itself. Example: "You can't trust her opinion on climate change because she drives an SUV."
- Argument from Authority: Claiming something is true because an expert or respected person says it is, without providing independent evidence. Example: "This doctor says this supplement works, so it must be effective."
- Argument from Adverse Consequences: Rejecting an argument or claim because its implications are uncomfortable or undesirable. Example: "We can't accept that climate change is real because it would mean completely changing our lifestyle."
- Appeal to Ignorance: Arguing that a claim must be true because it hasn't been proven false, or vice versa. Example: "No one has proven ghosts don't exist, so they must be real."
- Special Pleading: Creating arbitrary exceptions to a general rule to protect a preferred belief. Example: "All scientific studies are reliable, except the ones that contradict my beliefs."
- Begging the Question (Assuming the Answer): Making an argument where the conclusion is assumed in the premise. Example: "This book is great because it's the best book ever."
- Observational Selection: Focusing only on evidence that supports a predetermined conclusion while ignoring contradictory evidence. Example: A psychic highlighting successful predictions while ignoring numerous failed predictions.
- Statistics of Small Numbers: Drawing broad conclusions from a very limited sample size. Example: "I know two smokers who lived to 100, so smoking can't be that bad."
- Misunderstanding of the Nature of Statistics: Misinterpreting or misrepresenting statistical data to support an argument. Example: Claiming a 2% increase represents a dramatic trend.
- Non Sequitur: A conclusion that does not logically follow from the premises. Example: "I'm a good driver, so I would make a great pilot."
- Post Hoc, Ergo Propter Hoc: Assuming that because one event followed another, the first event caused the second. Example: "I wore my lucky socks and we won the game, so my socks must bring luck."
- Meaningless Question: Asking a question that cannot be meaningfully answered or is inherently nonsensical. Example: "What color is the sound of silence?"
- Excluded Middle (False Dichotomy): Presenting only two options when more alternatives exist. Example: "You're either with us or against us."
- Short-Term vs. Long-Term: Failing to consider the long-term consequences of an action while focusing solely on immediate results. Example: Cutting education funding to save money now, ignoring future economic impacts.
- Slippery Slope: Arguing that a relatively small first step will inevitably lead to a chain of related events, typically catastrophic. Example: "If we allow same-sex marriage, next people will want to marry animals."
- Confusion of Correlation and Causation: Assuming that because two things are correlated, one must cause the other. Example: "Ice cream sales increase in summer, and crime rates also rise, so ice cream causes crime."
- Straw Man: Misrepresenting or oversimplifying an opponent's argument to make it easier to attack. Example: "Environmentalists want to destroy the economy" when they propose sustainable economic practices.
- Suppressed Evidence (Half-Truth): Presenting only part of the information while deliberately concealing evidence that might contradict the conclusion. Example: Citing a study's positive results while omitting its limitations or contradictory findings.
- Weasel Words: Using ambiguous or misleading language to appear to make a point without actually committing to it. Example: "Some people say..." or "Studies suggest..." without citing specific sources.
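Several of these fallacies are quantitative, and the "statistics of small numbers" in particular can be made concrete with a short calculation. The sketch below is my own illustration, not part of Sagan's kit; the function name is invented. It uses exact binomial probabilities to show how often a small sample of fair coin flips looks lopsided purely by chance, while a larger sample almost never does.

```python
from math import comb

def prob_at_least(n, k):
    """Exact probability that n fair coin flips yield at least k heads."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

# Chance a sample looks "lopsided" (80%+ heads) by pure luck:
small = prob_at_least(5, 4)    # 5 flips, 4+ heads  -> 0.1875
large = prob_at_least(50, 40)  # 50 flips, 40+ heads -> well under 1 in 1,000

print(f"5 flips:  {small:.4f}")
print(f"50 flips: {large:.6f}")
```

Nearly one in five 5-flip samples shows 80% or more heads, so "I know two smokers who lived to 100" tells us almost nothing; the same lopsidedness in a 50-flip sample would be genuinely surprising.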
Cognitive Biases
In addition to the “baloney kit” components provided by Sagan, it is important to understand, and work to counteract, the unconscious cognitive biases that have been shown to affect every human's ability to think objectively.
- Confirmation Bias: The tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People actively avoid information that contradicts their existing beliefs and preferentially seek out confirmatory information.
- Anchoring Bias: The human tendency to rely too heavily on the first piece of information encountered (the "anchor") when making decisions. This initial information disproportionately influences subsequent judgments, even if the anchor is arbitrary or irrelevant.
- Availability Heuristic: Overestimating the likelihood of events with greater "availability" in memory. People judge the probability of an event based on how easily they can recall similar instances, which often leads to skewed risk assessments.
- Dunning-Kruger Effect: A cognitive bias where people with limited knowledge or competence in a given intellectual or social domain greatly overestimate their own knowledge or competence in that domain.
- Fundamental Attribution Error: The tendency to overemphasize dispositional or personality-based explanations for behaviors while underemphasizing situational explanations. People assume others' actions reflect their character while explaining their own actions through external circumstances.
- Survivorship Bias: Concentrating on the people or things that "survived" some process and inadvertently overlooking those that did not because of their lack of visibility. This leads to overly optimistic beliefs because failures are ignored.
- Sunk Cost Fallacy: Continuing a behavior or endeavor as a result of previously invested resources (time, money, effort). People struggle to rationally abandon a strategy because they've already invested in it.
- Bandwagon Effect: The tendency to do or believe things because many other people do, regardless of the underlying logic or evidence. People are more likely to adopt beliefs that are popular within their social groups.
- Negativity Bias: The psychological phenomenon by which humans give more weight to and pay more attention to negative experiences over neutral or positive experiences. Bad news and negative information impact us more strongly than good news.
- Recency Illusion: The belief that recent events or trends are more significant than historical patterns. People tend to overvalue recent information and undervalue long-term trends.
- Blind Spot Bias: The inability to recognize one's own cognitive biases while being able to spot them in others. This meta-bias prevents individuals from improving their critical thinking.
- Projection Bias: Unconsciously assuming that others think, feel, or would react the same way you would in a given situation. This leads to misunderstandings and incorrect predictions about others' behaviors.
- Status Quo Bias: The preference for the current state of affairs. People tend to resist change and prefer things to stay the same, even when change might be beneficial.
- Self-Serving Bias: The tendency to attribute positive events to one's own character but attribute negative events to external factors. People take credit for successes and blame circumstances for failures.
- Hindsight Bias: The tendency to perceive past events as having been more predictable than they were. After an event occurs, people believe they would have predicted or expected the outcome.
- Clustering Illusion: The tendency to see patterns where none exist, especially in random data. People are prone to finding meaningful connections in completely coincidental information.
- Optimism Bias: The tendency to overestimate the likelihood of positive events and underestimate the likelihood of negative events. People believe they are less likely to experience negative outcomes.
- Authority Bias: The tendency to attribute greater accuracy to the opinion of an authority figure and be more influenced by that opinion, regardless of its actual validity.
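The clustering illusion is easy to demonstrate by brute force. The sketch below is my own illustration, not from Sagan or Kahneman: it exhaustively enumerates every possible sequence of 16 fair coin flips and counts how many contain a streak of four or more identical results. Streaks that feel like meaningful patterns turn out to be the norm in random data.

```python
from itertools import groupby, product

def longest_run(seq):
    """Length of the longest run of identical consecutive items."""
    return max(len(list(group)) for _, group in groupby(seq))

n = 16
# Enumerate all 2**n possible sequences of n fair coin flips and count
# those containing a streak of 4+ identical results.
streaky = sum(longest_run(flips) >= 4 for flips in product("HT", repeat=n))
fraction = streaky / 2**n

print(f"{fraction:.3f} of all {n}-flip sequences contain a 4+ streak")
```

Roughly two-thirds of all 16-flip sequences contain such a streak, so spotting a "hot streak" in a short random sequence is expected, not evidence of an underlying pattern.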
References
Bayle, P. (1826). An Historical and Critical Dictionary. United Kingdom: Hunt and Clarke.
Duignan, B. (2025, March 20). Occam’s razor. Encyclopedia Britannica. https://www.britannica.com/topic/Occams-razor
Jefferson, T. (n.d.). Letter from Thomas Jefferson to John Trumbull. Library of Congress. Retrieved April 4, 2025, from https://www.loc.gov/exhibits/jefferson/18.html
Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
Locke, J. (1988). Locke: Two Treatises of Government (P. Laslett, Ed.). Cambridge: Cambridge University Press. https://doi.org/10.1017/cbo9780511810268 Available from https://standardebooks.org/ebooks/john-locke/two-treatises-of-government.
Oxford University Press. (n.d.). Empiricism, n., 6. In the Oxford English Dictionary. Retrieved April 4, 2025, from https://doi.org/10.1093/OED/2773008195
Sagan, C. (1987, February). Carl Sagan’s Baloney Detection Kit. Center for Inquiry. Retrieved April 4, 2025, from https://centerforinquiry.org/learning-resources/carl-sagans-baloney-detection-kit/ Originally published as “The Fine Art of Baloney Detection: How Not to Be Fooled,” PARADE Magazine, February 1, 1987.
Sagan, C. (1995). The Demon-Haunted World: Science as a Candle in the Dark. Ballantine Books.