Why Your Brain Lies to You

You are not as rational as you think you are.

Neither am I. Neither is anyone. This isn’t pessimism - it’s just how brains work. Our brains evolved to make fast decisions, not accurate ones. Speed kept our ancestors alive. Accuracy was a nice-to-have when something was moving in the grass nearby.

The result is a set of mental shortcuts called cognitive biases. They follow patterns. They’re predictable. And they’re running in your head right now.

What Are Cognitive Biases?

Cognitive biases aren’t random mistakes. They’re patterned mistakes that almost everyone makes in the same situations. Researchers can trigger them reliably in controlled experiments - the same setup produces the same error, over and over.

Think of them as features, not bugs. They made sense in the world our ancestors lived in. The problem is that your brain is running ancient software in a modern world.

Why Do We Have Them?

Imagine you’re an early human. You hear a rustle in the tall grass. Two possible responses:

Response A: “That’s probably just the wind. Let me carefully gather more data before reacting.”

Response B: “PREDATOR! RUN!”

If it’s actually the wind, Response B wastes some energy. You look foolish. No lasting damage.

If it’s actually a lion, Response A gets you removed from the gene pool.

Over many generations, the careful, slow-thinking humans got eaten more often. The jumpy, quick-to-react ones lived long enough to have kids. We’re descended from the jumpy ones. Evolution doesn’t care about accuracy - it cares about staying alive.

This is why your brain tends to:

  • Assume threats are real (even when they’re not)
  • See patterns (even where none exist)
  • Make snap judgments (even when you have time to think)
  • Prefer familiar things (even when new things are better)

These shortcuts kept your ancestors alive long enough to have children. They just don’t always help when you’re trying to make good choices about your career, relationships, or whether to click on that news article designed to make you angry.

The Big Ones

Here are some cognitive biases you’ll recognize. Yes, in yourself. Don’t pretend you’re immune.

Confirmation Bias

You notice evidence that supports what you already believe. You dismiss, forget, or explain away evidence that contradicts it.

Example: You think your coworker is lazy. You notice every time they take a break or leave early. You don’t notice the times they stay late or help others - or you explain those away as exceptions.

This one is sneaky because you don’t know you’re doing it. Your brain filters what you see before you’re even aware of it. You truly believe you’re being fair.

Availability Heuristic

You judge probability based on how easily examples come to mind, not actual statistics.

Example: After seeing news coverage of a plane crash, flying feels dangerous - even though your drive to the airport is far riskier. Plane crashes are rare but memorable. Car crashes are common but unremarkable to report, so your brain doesn’t flag them as threats.

This is why 24-hour news is bad for your brain. It fills your head with vivid examples of rare disasters. This makes the world feel more dangerous than it really is.

Anchoring

The first number you hear influences your judgment, even when that number is completely arbitrary.

Example: A store marks a shirt as “$100, now $50.” You think it’s a deal, even if the shirt was never worth $100 - even if no one in history has ever paid $100 for that shirt. The anchor has done its job.

Negotiations exploit this constantly. Whoever sets the first number controls the frame.

Fundamental Attribution Error

You attribute others’ behavior to their character. You attribute your own behavior to circumstances.

Example:

  • Someone cuts you off in traffic → “What a jerk.”
  • You cut someone off → “I’m late for an important meeting.”

Same behavior. Completely different explanations. And you’ll do this automatically, without noticing the inconsistency.

Dunning-Kruger Effect

Knowing a little about something makes it easy to overestimate how much you know. As you learn more, you discover how much you don’t know, and your confidence becomes more realistic.

Example: After reading one article about economics, you feel ready to debate economists. After studying economics for years, you understand why economists are always hedging their predictions.

Peak confidence often means peak ignorance. This is hard to accept, but it happens again and again.

Sunk Cost Fallacy

You continue investing in something because of what you’ve already invested, not because it makes sense going forward.

Example: You keep watching a terrible movie because you’ve already watched an hour. But that hour is gone either way - the only question is whether to waste another hour on top of it.

This bias ruins careers, relationships, and businesses. “But I’ve already put so much into this” is not a reason to continue. What you spent in the past doesn’t change whether spending more makes sense now.

In-Group Bias

You favor people in your group and view outsiders with suspicion.

Example: Your team’s mistakes are understandable - complex circumstances, external pressures, bad luck. The other team’s mistakes prove they’re incompetent. Same mistakes, different tribe, different interpretation.

Why This Matters

These biases aren’t just fun facts to mention at parties. They affect real choices with real results:

  • Confirmation bias keeps you stuck in wrong beliefs indefinitely
  • Availability heuristic makes you afraid of the wrong things while ignoring actual risks
  • Anchoring costs you money in negotiations you didn’t even realize were negotiations
  • Attribution error damages relationships by making everyone else seem malicious or incompetent
  • Sunk cost fallacy keeps you in bad jobs, bad relationships, and bad investments long past the point where you should have walked away
  • In-group bias creates unnecessary conflict and blinds you to your own tribe’s failures

This isn’t abstract. These biases are operating in your decision-making right now.

The Good News

Knowing about biases doesn’t make them go away. Your brain still takes these shortcuts on its own - that’s how it’s built. You can’t uninstall the software.

But knowing helps in two ways:

1. You can pause before big choices. When it matters, you can slow down and ask: “What bias might be at work here?” This won’t always catch it, but it helps a lot.

2. You can build systems that help. Checklists. Asking people who disagree with you. Rules that force you to think about other options. Waiting a day before major choices. These tools help fight your brain’s default shortcuts.

A Humbling Exercise

Think of a belief you hold strongly. Something you’re confident about. Now ask yourself:

  • What evidence would change my mind?
  • When was the last time I actively looked for that evidence?
  • If I’m honest, am I more interested in being right or in finding the truth?

If you can’t think of anything that would change your mind, that’s a warning sign. It means your belief isn’t based on evidence at all - it’s based on who you are, how you feel, or what your group believes. You’re not thinking; you’re defending.

That doesn’t mean the belief is wrong. But it does mean you’re not holding it because you looked at the facts and followed them.

Most people never do this exercise. They just assume their beliefs make sense because… they’re their beliefs. That’s not how this works.

What’s Next

Part 1 covers how to test your beliefs using ideas from science. Not in a lab with fancy equipment - in your daily life, with the choices you actually make.

The goal isn’t to become a perfect thinking machine. That’s not possible, and it doesn’t sound fun anyway. The goal is to catch yourself when your brain’s shortcuts are leading you astray. Even catching yourself 10% of the time is a big improvement - for most people, that number is close to zero.


Enjoyed This?

If this helped something click, subscribe to my YouTube channel. More content like this, same approach - making things stick without insulting your intelligence. It’s free, it helps more people find this stuff, and it tells me what’s worth making more of.