Changing Your Mind
F. Scott Fitzgerald wrote that the test of a first-rate intelligence is holding two opposed ideas in mind simultaneously while retaining the ability to function.
I’d add: the test of intellectual honesty is changing your mind when evidence warrants it - without treating it as some kind of moral failure.
This is, apparently, a radical position.
Why Changing Your Mind Is Hard
Beliefs feel like who you are. When someone attacks your view on the economy, it doesn’t feel like they’re questioning your thinking - it feels like they’re attacking you. Your brain can’t tell the difference between “this idea is wrong” and “you are wrong.”
There’s also social punishment. People who change their minds get labeled flip-floppers. Unreliable. Wishy-washy. Watch any political campaign: consistency is rewarded regardless of whether the consistent position is correct. Society has decided that confident wrongness beats honest uncertainty.
And there’s cognitive cost. Changing a belief means rearranging everything connected to it. Your brain would rather ignore clashing facts than do the mental version of cleaning your closet. New evidence shows up and your brain goes “I’m sure that’s wrong somehow” and moves on.
All these forces push toward belief persistence - holding onto beliefs regardless of evidence.
Why Changing Your Mind Is Valuable
Reality doesn’t care about your beliefs. This is perhaps the most important sentence in this entire series. Whether you believe in gravity or not, you’ll fall if you step off a cliff. The universe is magnificently indifferent to your opinions about it.
Accurate beliefs help you navigate reality effectively. Inaccurate beliefs cause you to walk into walls. Every false belief you hold is:
- A wrong prediction waiting to surprise you
- A bad decision waiting to happen
- Energy spent defending something that isn’t true
Changing your mind when facts demand it isn’t flip-flopping. It’s fixing mistakes. It’s how you get less wrong over time. The other option - never changing - means being just as wrong at 80 as you were at 20. That’s a sad way to spend sixty years.
What Changes and What Doesn’t
Not all mind-changing is equal. This distinction matters.
Good mind-changing: Updating beliefs based on new evidence, better arguments, or recognized errors in your previous reasoning.
Bad mind-changing: Changing beliefs based on social pressure, temporary emotions, or whoever yelled at you most recently on the internet.
The goal isn’t to be easily swayed - that’s just being a leaf in the wind. The goal is to respond to good evidence and good arguments while ignoring bad ones. You want a mind that’s open but not empty.
Be Data-Driven, Not Opinion-Driven
Before we get into the mechanics of updating beliefs, let’s establish something fundamental: your beliefs should be based on data, not feelings.
I know. Feelings are very compelling. They arrive with such urgency and certainty. But feelings are not evidence. They’re signals that something is happening in your nervous system - useful information, but not proof of anything about external reality.
This connects to Part 1 (Testing Your Assumptions). The same scientific thinking that helps you test assumptions tells you when to change your mind:
First Principles Thinking:
- What do I actually know versus what do I merely assume?
- What’s the foundational evidence for this belief?
- If I started from scratch with no preconceptions, would I arrive at the same conclusion?
The Scientific Method Applied to Beliefs:
- State your current belief clearly (hypothesis)
- Ask: what evidence would prove me wrong? (falsifiability)
- Actively look for that evidence (experimentation)
- Update based on what you find (conclusion)
- Repeat forever (iteration)
Data Over Narrative:
- Anecdotes aren’t data. “I know someone who…” is not evidence. It’s a story.
- Feelings aren’t data. “It feels true” is not evidence. It’s a feeling.
- Authority isn’t data. “Expert X says…” needs to point to actual evidence. Experts can be wrong.
- Popularity isn’t data. “Everyone believes…” is not evidence. Everyone has been wrong before. Repeatedly.
The question isn’t “do I want to change my mind?” It’s “what does the data say?”
How to Actually Update
1. Notice the trigger
Something prompted you to reconsider. What was it?
- New data you hadn’t seen before?
- A study or measurement that contradicts your belief?
- An argument that exposes a flaw in your reasoning?
- The uncomfortable realization that your belief was based on assumptions, not evidence?
Name it clearly. This helps distinguish legitimate updates from social pressure. “I’m reconsidering because I saw compelling data” is different from “I’m reconsidering because people on Twitter were mean to me.”
2. Return to first principles
Before deciding whether to update, go back to basics:
- What’s the actual claim? State it precisely. Vague beliefs are impossible to evaluate.
- What evidence originally supported it? Was it strong? Or did you just absorb this belief from your environment?
- What’s the new evidence? Is it reliable, reproducible, falsifiable?
- Do the two actually conflict? Sometimes new data adds nuance rather than contradicting. A single counterexample doesn’t always destroy a pattern.
This is scientific thinking applied to your own beliefs. You’re not asking “how do I feel about this?” You’re asking “what does the evidence support?” Those are different questions with potentially different answers.
3. Assess the evidence quality
Not all evidence is equal. This seems obvious, but watch how often people treat a single anecdote with the same weight as a meta-analysis of fifty studies.
Apply the same standards you’d want in a scientific study:
- Source reliability: Who produced this? Do they have relevant expertise? Conflicts of interest?
- Methodology: How was this data gathered? Could it be biased? Sample selection matters.
- Reproducibility: Has anyone else found the same thing? One study is a data point, not a conclusion.
- Sample size: Is this based on robust data or a few dramatic examples?
- Falsifiability: Could this claim even be proven wrong? If not, it’s not a claim - it’s a belief system.
Here’s a useful test: if someone with the opposite belief showed you this exact evidence, would you find it convincing? If you’d dismiss it when it came from them but accept it because it confirms what you already think - congratulations, you’ve caught yourself rationalizing.
4. Quantify your confidence
Belief change doesn’t have to be all-or-nothing. Binary thinking is for computers. Think in probabilities:
- You can move from 90% confident to 70% confident
- You can move from “definitely true” to “probably true”
- You can move from “absolutely certain” to “I need more data”
Calibrated confidence means your certainty matches the strength of your evidence. Strong evidence = high confidence. Weak evidence = low confidence. No evidence = no opinion yet.
Small updates count. You don’t have to flip completely to the opposite view. Moving from “certain” to “pretty sure” is still progress.
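Probability-style updating has a formal counterpart in Bayes’ rule, which this section doesn’t invoke by name - so treat the following as one optional way to make “move from 90% to 82% confident” precise. A minimal sketch with made-up numbers (the scenario and figures are purely illustrative):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a belief after seeing evidence.

    prior: confidence before the evidence, between 0 and 1
    p_evidence_if_true: chance of seeing this evidence if the belief is true
    p_evidence_if_false: chance of seeing it if the belief is false
    """
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# You start 90% confident. A result appears that is twice as likely
# if your belief is false (20%) as if it is true (10%).
posterior = bayes_update(prior=0.90,
                         p_evidence_if_true=0.10,
                         p_evidence_if_false=0.20)
print(f"{posterior:.0%}")  # prints "82%"
```

Notice that even evidence pointing the “wrong” way only moves a strong prior a little: 90% drops to about 82%, not to 10%. That is the numerical version of a small update - from “certain” to “pretty sure” - rather than a flip to the opposite view.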
5. Acknowledge it openly
This is the hard part for most people. Say it out loud:
“I used to think X, but now I think Y because Z.”
This feels exposing. But consider what it actually demonstrates:
- Honesty: You’re not pretending you always believed Y
- Intellectual rigor: Others can learn from your reasoning process
- Credibility: People trust those who can acknowledge error more than those who claim infallibility
6. Integrate the update
A changed belief has implications. What else changes as a result?
If you were wrong about this, what else might you be wrong about? Your beliefs form a web - they connect to each other. Pull one thread and others may need adjustment.
Updates often come in clusters. One correction reveals other corrections needed. This is uncomfortable but efficient: you’re debugging your worldview.
The Social Challenge
Social environments often punish mind-changing:
Politics: Changing positions is “flip-flopping.” Consistency is rewarded even when the consistent position is demonstrably wrong. Voters apparently prefer someone who’s been wrong for twenty years to someone who updated their view last month.
Relationships: “But you said X before!” People keep score of positions like they’re contractual obligations.
Work: Admitting you were wrong can feel like admitting incompetence. In some corporate cultures, it actually is - which says more about those cultures than about you.
Online: Your old takes are screenshot-able forever. The internet keeps receipts. You can be held accountable for things you said years ago, in different contexts, before you had information you now have.
There’s no easy solution. But some approaches help:
Explain your reasoning, not just your conclusion. “I changed my mind because I saw this data” commands more respect than unexplained reversal.
Frame it accurately: “I learned something new” rather than “I was an idiot.” You weren’t being foolish - you had incomplete information. Now you have more.
Be consistently open. If people know you’re always willing to update on evidence, changes become expected rather than surprising. This becomes part of your reputation.
Choose your battles. You don’t have to publicly announce every belief update. Some beliefs you can just… quietly hold differently.
Identity-Level Beliefs
The hardest beliefs to change are the ones fused to your identity.
If “I’m a skeptic” is core to how you see yourself, you might resist believing things that are actually well-supported-because accepting them would threaten your self-image.
If “I’m open-minded” is core to your identity, you might accept things you should scrutinize-because skepticism would contradict who you think you are.
If your political identity is central to who you are, changing any political view feels like betraying yourself. This is why political arguments are so unproductive: people aren’t defending positions, they’re defending themselves.
The deeper the belief, the harder the update. This explains why many people will change surface-level opinions while core beliefs remain locked in for life.
A solution: Make “I update on evidence” part of your identity. Then changing your mind becomes consistent with who you are rather than threatening to it. You’re not flip-flopping; you’re doing exactly what you said you’d do.
What Mind-Changing Looks Like
Let’s be clear about what this isn’t:
- Agreeing with whoever talked to you last
- Having no convictions
- Treating all positions as equally valid
- Changing beliefs based on feelings or social pressure
And what it is:
- Following the data wherever it leads, even when inconvenient
- Being willing to say “I was wrong - here’s the evidence that changed my mind”
- Tracking what changed your mind and why, so you can evaluate your own reasoning
- Holding beliefs with confidence proportional to evidence strength
- Applying the same standards to your own beliefs that you’d apply to others'
- Respecting people who change your mind rather than resenting them for the discomfort
A Final Thought
There’s a phrase: “Strong opinions, weakly held.”
This means: have clear views and act on them, but be ready to change when data warrants. Don’t confuse confidence with certainty. Don’t let strength of opinion substitute for quality of evidence.
The data-driven mindset:
- Your beliefs are hypotheses, not identities
- Evidence is what updates hypotheses
- Being wrong is information, not failure
- Uncertainty is honest, not weak
The goal isn’t to never be wrong - that’s impossible. The goal is to be less wrong over time. To follow the data. To update when evidence demands it. To get closer to truth through iteration rather than stubbornness.
Every time you change your mind based on evidence, you become slightly more calibrated, slightly more accurate, slightly better at navigating reality.
That’s not weakness. That’s how science works. That’s wisdom.
Series Conclusion
You now have the core tools:
- Understanding cognitive biases - why they exist and how they operate
- Testing assumptions with first principles and scientific thinking
- Moving beyond either/or thinking to see spectrums
- Recognizing emotional hijacks and waiting them out
- Seeking disconfirmation deliberately
- Being data-driven rather than opinion-driven
- Changing your mind when evidence warrants it
The thread through all of this: follow the data, not your feelings.
Your brain wants to protect existing beliefs, confirm what you already think, and avoid the discomfort of being wrong. That’s biology - it’s not a character flaw. But you can override it by consistently asking: “What does the evidence actually say?”
These aren’t one-time techniques. They’re habits to build over a lifetime. You’ll still fall for biases - everyone does, including people who study biases professionally. But you’ll catch yourself more often. You’ll make fewer predictable mistakes. You’ll think more clearly more of the time.
In a world full of people who believe things because they feel true, being data-driven is a genuine advantage. Not because you’re smarter - but because you’re willing to be wrong in pursuit of being right.
That’s the whole point.
Enjoyed This?
If this helped something click, subscribe to my YouTube channel. More content like this, same approach - making things stick without insulting your intelligence. It’s free, it helps more people find this stuff, and it tells me what’s worth making more of.