Psychological Blind Spots in Reasoning: Improving Our Critical Thinking, By Professor X

The American Thinker piece (March 25, 2026, by Robert Arvay) is a short, reflective blog post using the metaphor of visual blind spots (the small area in each eye where the optic nerve connects, creating a gap we don't notice because the other eye compensates) to explain psychological blind spots in reasoning.

Core Idea

Even highly intelligent, accomplished, ethical, or "brilliant" people — like Professor X, of course (a joke!) — can be dead wrong on specific issues. Not because they're stupid, dishonest, or stubborn in the ordinary sense, but because they have an invisible cognitive gap that prevents them from perceiving the flaw in their own logic. The author can't see his own blind spots (by definition), and smarter people are no exception. The remedy? Humility, self-scrutiny, and openness to external perspectives that "fill in" the gap, much like binocular vision.

Key Examples from the Article

Arvay's former boss: A smart, successful manager who rolled out a policy that clearly harmed the department. When challenged with a better alternative, the boss dismissed it with seemingly rational (but flawed) arguments and became permanently convinced of his own correctness. Higher-ups later overruled him, the alternative worked better, yet the boss still insisted his original idea was superior. It wasn't ego or malice — the boss genuinely couldn't "see" the obvious problem due to a psychological blind spot.

Online debate with another brilliant person: They agreed on big-picture conclusions (e.g., scepticism toward strict physicalist/materialist explanations of mind/consciousness/evolution, and criticism of academic corruption driven by funding). But they diverged sharply on the reasoning paths. The author couldn't get the other person to recognize the holes in their logic, despite aligned outcomes.

The piece speculates these blind spots are common and affect not just personal or workplace decisions, but broader public discourse, science, and national policy. Smart people aren't immune; expertise in one area doesn't guarantee clear vision everywhere.

Why This Fits the Pattern the Alor.org Blog Has Been Discussing

Diet mania (Mercola on high-fat/keto risks): "Experts" and influencers on both sides (the low-fat establishment vs. the high-fat gurus) can miss trade-offs, individual variation, or long-term data due to ideological or incentive-driven blind spots. Short-term wins (weight loss) blind some to the costs of lost metabolic flexibility; others cling to old low-fat dogma despite the obesity epidemic.

Conspiracism (Counter-Currents): Over-reliance on grand-cabal explanations can create a blind spot to mundane realities like incentives, cultural consensus, incompetence, and emergent patterns. The flip side — naive institutional trust — has its own massive blind spots (e.g., ignoring real coordination or capture). A sane view needs both agency and limits on planning/foresight.

In all cases, smart, credentialed, or ideologically committed people get things wrong because intelligence, education, or success can amplify certain biases:

Blind spot to one's own errors: High IQ often correlates with better rationalisation of preferred conclusions (the "intelligence trap" or "bias blind spot" documented in psychology research — smarter people sometimes detect biases in others more easily but exempt themselves).

Domain expertise trap: Deep knowledge in one field can create overconfidence when applied elsewhere, or tunnel vision that ignores interdisciplinary factors.

Incentive and social alignment: Funding, status, peer groups, or worldview commitments (materialism in science, profit in diets, power in politics) shape what one "sees."

Motivated reasoning: People (smart or not) are excellent at finding evidence for what they want to believe and overlooking contradictions.

This isn't anti-intelligence. Raw smarts help process information, spot patterns, and build arguments. But it doesn't automatically confer wisdom, humility, or rationality. As researchers like Keith Stanovich have argued, intelligence (as measured by IQ/tests) is not the same as rationality — the ability to override biases, update beliefs with new evidence, and avoid myside bias. Education can even worsen polarisation or overconfidence in some contexts.

A Balanced Take: Smart People and Regular Folks Alike Get It Wrong

Arvay's humility is refreshing: even he can be right sometimes, and we shouldn't outsource our thinking entirely to "people smarter than me." At the same time, dismissing expertise wholesale is its own blind spot — plumbers, engineers, and statisticians often outperform lay opinion in their domains for good reason. The healthy stance is epistemic humility:

Test claims against evidence, outcomes, and trade-offs.

Seek disconfirming data and diverse viewpoints (the "other eye").

Recognize that history, science, and policy are littered with confident expert failures: dietary guidelines that fuelled metabolic disease, economic models that missed crashes, public health responses with massive unintended costs, ideological certainties that aged poorly.

Incentives matter: Academia, media, government, and industries reward certain narratives. Groupthink and career risks create collective blind spots.

Tying back to "getting back to basics" from your diet query: in thinking as in eating, extremes and manias thrive on overconfidence. Humans aren't omniscient. Our ancestors survived without PhDs or grand theories by observing what actually worked in their environment, adapting pragmatically, and learning from mistakes — through trial, community input, and direct feedback from reality.

Smart people err for the same human reasons we all do: limited perspective, emotional investment, social pressures. The antidote isn't anti-expert populism or blind deference — it's rigorous, evidence-based scepticism combined with intellectual honesty. Question everything, including your own sacred cows, and update when the data (or results) demand it.

This piece is a gentle reminder against hero-worship of intellectuals or authorities. It aligns with a realist view: power, ideas, and decisions involve real agency and planning, but also pervasive human fallibility — no human has a God's-eye view.

https://www.americanthinker.com/blog/2026/03/why_people_smarter_than_me_are_sometimes_wrong.html