Why We Believe — Even When It Makes No Sense
The Mind’s Hidden Shortcuts: How Belief Builds Itself
Most of the time, when we talk about being fooled, we picture someone else doing the tricking — a scammer, a con artist, a political spin doctor.
But here’s the uncomfortable truth: sometimes the mind fools itself — and it is very good at it.
Not because we are stupid, weak, or gullible. Far from it. The same mental habits that help us survive in a noisy, unpredictable world can also lead us into dead ends. We spot patterns. We make quick judgments. We protect what feels important. And when those instincts get triggered in just the right way, they can lock us into a belief so tightly that facts barely make a dent.
Flat Earth influencers, for example, understand this better than they admit. They do not have to “convince” someone in the traditional sense. All they need to do is feed what already feels true and let the mind do the rest. They speak to emotion first, then dress it up in logic. And because the conclusion feels like it came from inside your own head, it feels earned. When something feels earned, it feels worth defending.
This is not about flat versus round. It is about how belief takes root, why it can feel so unshakable, and how our own brains can quietly become part of the persuasion machine.
Our minds are wired for speed, not perfect accuracy. That is not a flaw — it is survival. But in the wrong hands, those shortcuts can be turned into traps.
One of those shortcuts is pattern-finding. We are natural pattern-seekers — it is how we learn and stay safe. But sometimes we connect dots that do not belong together. The horizon “looks” flat. A certain number keeps showing up. A coincidence feels like it must mean something. When something feels meaningful, we often stop asking if it really is.
Once we decide something is true, our brains start curating the evidence. This is confirmation bias — the tendency to notice and highlight what agrees with us while quietly ignoring what does not (Rokeach, 1968). Flat Earth channels thrive on it, feeding audiences a steady diet of “proof” that matches what they already suspect, each post or video a reassuring nod that they are on the right track.
Then there is the Dunning–Kruger effect — the gap between how much we think we know and how much we actually know. It can sneak up on anyone. When we only have a surface-level grasp of a topic, it can feel like we see the whole picture. The missing details — the tricky exceptions, the messy complexity — are invisible to us, so we do not even realize they are missing. That is why a neat slogan or a simple diagram can feel so convincing. It fits the version of the topic we think we understand, even if reality is far more complicated.
And perhaps the most quietly dangerous: the illusory truth effect. Say something often enough and it starts to sound true, even when it is not (Arkes, Boehm, & Xu, 1991; Hasher, Goldstein, & Toppino, 1977; Unkelbach & Greifeneder, 2018). Inside an echo chamber, repetition is not just common — it is the point. Familiar phrases like “Water finds its level” become anchors. They stick, not because they are accurate, but because they are familiar.
Persuasion works best when it taps into these mental habits we all share. The tactics that pull on these levers are often subtle but powerful. Social proof — showing that “everyone” believes something — makes it feel safer to agree (Cialdini, 2001). Identity fusion — blending belief with self-worth — makes doubt feel like betrayal (Tajfel & Turner, 1979; Van Zomeren, Postmes, & Spears, 2008). The illusion of mastery — simplifying complexity into neat diagrams or slogans — creates a satisfying sense of understanding. And rejecting outside authority in advance turns scientists, journalists, or educators into untrustworthy “outsiders” before they can even speak (Herman & Chomsky, 1988).
This is why certain narratives seem to stick no matter how many counterarguments are offered. They are not winning because the evidence is stronger. They are winning because the message is built to match the way the human brain naturally works.
That is why this matters. None of us are immune to these shortcuts. They are not signs of weakness. They are part of how we think, decide, and make sense of the world. The danger comes when someone knows how to pull those levers and uses them to lock us into an idea — whether it is true or not.
The more we recognize those levers — in the media we consume, in the arguments we hear, and even in our own thinking — the harder they are to pull without our noticing. And the moment we notice, we start to reclaim control over what we believe and why.
Stay skeptical. Stay curious. And remember — see the patterns, spot the levers, keep your balance.
References:
Arkes, H. R., Boehm, L. E., & Xu, G. (1991). Determinants of judged validity. Journal of Experimental Social Psychology, 27(6), 576–607.
Cialdini, R. B. (2001). Influence: Science and practice (4th ed.). Allyn and Bacon.
Entman, R. M. (1993). Framing: Toward clarification of a fractured paradigm. Journal of Communication, 43(4), 51–58.
Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.
Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107–112.
Herman, E. S., & Chomsky, N. (1988). Manufacturing consent: The political economy of the mass media. Pantheon Books.
Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion: Central and peripheral routes to attitude change. Springer.
Rokeach, M. (1968). Beliefs, attitudes, and values: A theory of organization and change. Jossey-Bass.
Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 33–47). Brooks/Cole.
Unkelbach, C., & Greifeneder, R. (2018). Experiential fluency and declarative advice jointly inform judgments of truth. Journal of Experimental Social Psychology, 76, 162–171.
Van Zomeren, M., Postmes, T., & Spears, R. (2008). Toward an integrative social identity model of collective action: A quantitative research synthesis of predictors of collective action. Psychological Bulletin, 134(4), 504–535.
