The Hard Work of Belief
Alice laughed. ‘There’s no use trying,’ she said. ‘One can’t believe impossible things.’
‘I daresay you haven’t had much practice,’ said the Queen. ‘When I was your age, I always did it for half-an-hour a day. Why, sometimes I’ve believed as many as six impossible things before breakfast. There goes the shawl again!’
― Lewis Carroll, Through the Looking-Glass
We might laugh at the White Queen, but let me ask you: What do you believe? What do you really believe? What does it mean to believe something? What’s your most important belief? How did you come to believe that? Are you sure it’s true? What could get you to change that belief?
I’m not just asking questions like an annoying six-year-old, though these are the kinds of questions that can give us a headache. The ideas behind these questions give us the tools to approach the throne of Truth, justified, and in full awareness of the complexities. I’ve also seen what happens when this process goes wrong, and I’ll share that story at the end.
First, let’s spend some time getting this right.
We might wish that believing something was simply a matter of testing whether it’s true or not. Yes, there’s that aspect, but there’s much more. It’s more realistic to say something like this: Belief is a truth-seeking utility function, tuned by social factors. Hopefully, beliefs move us toward truth, but it’s not a clear-cut, binary decision. Belief is not just about ourselves—it’s also about the environment we are in and those around us.
In the simplest cases, our beliefs nicely align with observable reality and experience: “this cup of coffee is too hot to drink right now.” Test it and find out. You have a litany of these kinds of beliefs which you have never thought about or questioned deeply. “I believe that driver will stop at the stop sign.” You don’t have to figure that out anew at each stop sign.
It’s good that we do not have to constantly reexamine all our beliefs. Many beliefs run as heuristics—rules of thumb that shape behavior in (mostly) useful ways and that don’t require active thought. They save time, speed our reactions, and reduce cognitive load. Overthinking would paralyze us.
We can lean on those shortcuts. Though that cup of coffee is currently too hot, I believe the second law of thermodynamics is in effect today, so it will be cool enough to drink soon. Because I have prior experience with cups of coffee, I can even make some smart guesses about how long that might take.
Words, words, words.
Language probably makes us what we are, but words are both useful tools and absolute prisons. If we’re going to think deeper, we need to confront some words. Think about: belief, truth, and knowledge. Are they precisely the same? Not quite. We’ve covered truth. But do you only believe something because it is true? If so, what do you mean by “true”? And, of course, we would never think anything is true because we believe it.
My cup of coffee being too hot is a good example of a Justified True Belief (JTB), but even this apparently simple idea can go badly wrong. A philosopher named Edmund Gettier turned the idea of JTB on its head in the 1960s, and we now talk about Gettier problems! Sometimes we might be right and have supporting evidence, yet be right for the wrong reasons.
Imagine that you see someone who looks like your father shoveling snow. Naturally, you believe your father is shoveling snow. In fact, the neighbor just happens to be wearing the same coat your father wears, and you are mistaking the neighbor for your father. But, in a twist on a twist, your father actually is shoveling snow, just out of sight.
So you were right, but only by chance—your evidence is not connected to the truth. Being right is sometimes just luck, even with justification.
Some beliefs are better than others
Of course, the question that ultimately matters is “should I believe this thing?” From a practical perspective, matching beliefs to experience and evidence is probably a good place to start. Sometimes this is simple, but let’s sharpen the sword of rational, logical thought.
The idea of falsification—that we must be able to define what would make us think the belief is not true—is useful in many frameworks. Karl Popper’s idea has shaped much of the scientific method. In your everyday life, try tuning your thinking this way. We naturally seek evidence that supports our ideas, that tells us we are right. But the risk of this is that we miss things that could show us we are wrong—that we live in a self-constructed echo chamber. Pay attention. Even just making room for contradiction will make you a better thinker.
Bayesian updating is a process where we start with a (prior) belief and then update it as new evidence comes in. Seems simple to write, but this is one of the towering ideas of the last five hundred years of human thought. I would caution that you’re probably not naturally as good at this as you might assume, so some attention here can make us better thinkers. Again, for certain kinds of beliefs, this is one of the best ways to make sure our beliefs make sense.
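The update rule itself is short enough to sketch in a few lines of Python. The numbers below (a 1% prior, a 95% hit rate on the evidence, a 5% false-positive rate) are invented purely for illustration:

```python
# Toy Bayesian update: how much should one piece of evidence shift a belief?
# All numbers here are illustrative assumptions, not real statistics.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior P(belief | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Prior: I'm 1% sure some rare claim is true.
prior = 0.01
# Evidence that shows up 95% of the time when the claim is true,
# but also 5% of the time when it is false.
posterior = bayes_update(prior, p_evidence_if_true=0.95, p_evidence_if_false=0.05)
print(round(posterior, 3))  # the 1% prior rises to only about 16%
```

The surprise, and the reason we’re not naturally good at this, is in that last line: strong evidence against a strong prior still leaves the belief far from certain. Each new piece of evidence just becomes the prior for the next update.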
Why do you believe?
These are tools, and good tools. Nothing works in all cases, but these tools give us the best answers for certain types of beliefs. But I find another question even more interesting: where do our beliefs come from? The short answer is that there are many answers, but we can outline a few important paths.
Beliefs may come from experience—structured or unstructured. If unstructured, our rational minds will create the structure and sort like with like, helping us distill belief from raw experience.
I hinted at the social aspect of belief. This can be a tiny influence, or overwhelmingly powerful. In some cases, someone else can, in fact, tell us what to believe. We can accept expert opinion or the testimony of a witness who saw something. We may also accept the authority of a crowd, if we find they all believe the same thing. Surely if a large group of people believes pineapple is delicious on pizza they cannot be wrong. (They are. Tragically.)
We might also have an intuition, a feeling, or even a revelation that leads us to a belief. And it’s very likely some of our beliefs may come out of other beliefs. We can build entire belief systems from interconnected beliefs. (Be mindful of the foundation.)
In most cases, there’s an element of pattern recognition and pattern matching—our brains are very good at this, but they also make some astounding errors. Some beliefs may spiral out of narrative or creative thinking, completely disconnected from concrete experience.
Are you getting some idea how complex, nuanced, and interwoven this all is? It’s not nearly as simple as “I believe it because it’s true.”
Language shapes what we are even capable of thinking about, and emotions may create illogical weights to some of our beliefs. In truth, we’ve just scratched the surface, but we’re already pretty far from that simple, now cold, cup of coffee.
Do we choose to believe?
One thing the White Queen might not have been clear about: she probably discarded many of those “impossible beliefs” right away. We do this too, and naturally—simply testing a belief against reality may quickly show us that we cannot, in fact, bench press 300 pounds (and that 300 pounds will bench press us). Given enough contrary evidence, it would be silly to hold a belief, but, as they say, it’s your funeral.
But how much evidence is enough? That depends on the belief. If I believe that all swans are white, seeing a single black swan would show that belief to be false. The belief has been falsified. But then perhaps I could refine that belief—almost all swans are white. Now we better get started counting swans and figure out what, exactly, “almost all” means.
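The asymmetry in the swan example can be sketched in a few lines of Python: any number of white swans only corroborates the universal claim, while a single counterexample ends it. The swan records here are invented for illustration:

```python
# Minimal sketch of falsification: a universal claim ("all swans are white")
# is never proven by confirming cases, but one counterexample refutes it.
# The swan data below are invented.

def falsified_by(claim, observations):
    """Return the first observation that contradicts the claim, or None."""
    for obs in observations:
        if not claim(obs):
            return obs
    return None  # survived so far: corroborated, never proven

def all_swans_white(swan):
    return swan["color"] == "white"

swans = [{"color": "white"}, {"color": "white"}, {"color": "black"}]
print(falsified_by(all_swans_white, swans))  # one black swan is enough
```

Note what the `None` case means: the claim survived this batch of evidence, nothing more. The refined claim (“almost all swans are white”) is harder, because it needs counting rather than a single counterexample.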
Simple falsification is a nuclear bomb for some belief systems. If, for instance, we could show a single case of information transfer at a distance—say, a mother instantly knows her child died many hundreds of miles away with no electronic devices involved—we’d have trouble with the current materialist paradigm of the universe. That should not be possible, and, if it happens, something is wrong with everything.
But, of course, it’s rarely that simple. Some beliefs defy efforts to test them. The tools of truth testing may be powerless or inappropriate in some domains: could we falsify moral beliefs, for instance? What about cases where truth itself might be fuzzy? And if we do falsify a dangerously false belief, you better burn that stump of a neck before new heads sprout from it. Beliefs are like that.
How to get it completely wrong
Traditionally, authors stand on a pedestal and tell you the Right Way to be, think, act, and what to do. I’m not traditional. I’m going to tell you some of the dumbest, silliest things you can do and show you how to make some catastrophic mistakes. Let’s look at some really bad ideas. They aren’t completely useless, but they are dangerous, can be highly misleading, and can even be fatal.
I defined belief as a utility function—something that seeks usefulness. Yes, but don’t hold on to a belief just because it is useful. It’s not too hard to imagine lies that make people feel good, complacent, or compliant. Sometimes we tell ourselves these lies, but people in authority are fond of them too. Yes, beliefs can be useful, even if wrong. Useful? To whom? Being useful is not enough reason to hold on to a belief.
Once you discover you have a bad belief, you’re likely to be confronted with the cost of changing it. Yes, it’s not free—there is a cost. Maybe it’s social: we’ve argued a point so long our identity is tied up in it. Other people in our group believe the thing and I like being in that group. Maybe believing that thing made me feel special in some way and I don’t want to not be special. People can hold on to a belief, knowing full well it’s wrong, because it’s hard, painful, or otherwise expensive to change. Integrity abhors laziness.
Be careful of coherence. Good beliefs do tend to be coherent because truth tends to be coherent. But, as a test, it’s not enough. Many wacky theories are fully coherent. Some theories grow around a seed of a story, and pieces are added to that story that line up with existing parts and extend them. So, naturally, they are coherent. A good story is coherent, even if it’s bullshit.
Some conspiracy theories even make a lot of sense if you can ignore the fact they make no sense at all. You don’t have to be stupid to believe conspiracy theories (though, if you want to believe many of them, it probably helps). Don’t believe something just because it fits in with other things you already believe.
Don’t solely trust intuition or revelation. Even if an angel appeared from a rose bush and spoke truth to you, remember there are trickster gods. You don’t have to read many stories to realize the truth of “when Man is desperate, he prays. When the gods are desperate, they lie.” And they love to play games—so don’t blindly trust a spiritual revelation even if it seems you should. If you have an intuition or even a spiritual revelation that aligns strongly with one of your convictions, be mindful of the dangers of confirmation bias. We’re really good at having epiphanies that reinforce existing beliefs!
At the risk of slightly repeating myself: Don’t believe something just because people around you believe it. Most people are of average intelligence, and most groups exhibit intelligence that is somehow below the average of its members. Of course, it’s possible that a group belief is fully correct, but it’s a warning sign to be careful. There’s a reason the term “groupthink” is never used in a positive sense.
Don’t give up
And this brings us to some kind of conclusion. The bottom line is that belief is not simple. Even understanding our beliefs takes hard work and demands an unflinching look at our own minds. This is not easy to do.
A belief is not a simple weighing of evidence that leads, like an inevitable Rube Goldberg machine, to a final conclusion. It’s much more complicated, nuanced, and reflects all of the power, beauty, and flaws of the rational mind.
The seed for this piece was planted about twenty-five years ago. I had a rare conversation. It’s not often a single conversation irrevocably changes your opinion of a friend, but this one did. Seated beside my kitchen table, a friend told me that she “just wanted to find a fundamentalist church that would tell [her] what to believe. Make it simple. Make it black and white because gray is too hard. It’s too much trouble to figure it out.”
I thought, for a long time, about why that conversation devastated me and my opinion of her thought process. On one hand, I knew my friend was lazy. But, to me, this represented a betrayal of everything I held important. I could not imagine being capable of simply believing something because some group, somewhere, pretended to believe it. That lesson has never been far from my mind.
And that’s the crux of it. I don’t believe (ahem) that we choose what to believe. Neither do most philosophers, who would point to doxastic involuntarism—the idea that our beliefs are not under our conscious control. With apologies to the White Queen, you and I cannot choose to believe things that we know are not true: That the moon is made of blue cheese. That the moles in my yard have a secret nuclear base underground. That the earth is flat. Belief is not an act of will, and not under our conscious control. We cannot, with integrity, simply choose to believe.
But what we can choose is to do the hard work of examining our beliefs and the processes that uphold them. We can choose to face evidence, sit with the discomfort of uncertainty, hold our conclusions with integrity, and realize that integrity demands we often admit we do not and we cannot know.
But, as I said, none of this is simple. There are things I believe to be true, which might appear quite absurd to you, at first glance. There are things I believe to be true, for good reason, that might challenge many of your most dearly-held beliefs. And that’s where we’re going here.
Better beliefs help us understand the world better, but they do much more. Better beliefs make us better people.
It’s worth doing the hard work.
Note: This has been a necessary step in the First Fire project. It, along with a few of the posts that came before it, has leaned academic in tone. I’ve made an effort to be precise. Doing so has helped me refine and sharpen my thinking—though a bit painful (and, I hope, not so painful to read), working against the grindstone does sharpen the steel. I’ll refer back to some of this work in the future. And we now have a decent foundation.



An explorer searched for a legendary valley where time moved differently. Everyone knew it existed—travelers returned changed, speaking in riddles, unable to explain.
The old maps all contradicted each other. One showed the valley north, another south. Some said it moved.
So she developed a method: mark not just where paths led, but why she believed they led there. Note when certainty felt solid, when it felt hollow. Track which beliefs came from others' stories, which from her own steps.
Years passed. Her map became a living thing - full of crossed-out certainties, question marks, notes like "believed this was east until stars proved otherwise" and "six travelers swore this, but the rocks say different."
Other seekers found her annotations maddening. "Just tell us which way!" they begged. Some gave up, joined villages that promised the valley was just beyond their walls, if only you believed hard enough.
One dawn, she woke to find dew rolling upward in spirals. Her careful notes said this was impossible. Every belief she'd mapped said water doesn't move this way.
She watched the impossible dew, then added one more note to her map: "Here, certainty ends."
She smiled. Packed the map without consulting it.
Started walking toward what shouldn't exist.
Behind her, travelers still argued about whose beliefs about the valley were true.
They didn't notice she'd stopped believing anything.
At last, she was just walking, free again.
When scientists test an idea, they usually start by assuming the opposite of what they want to prove — the null hypothesis. If I believe a new drug works, the null hypothesis is “this drug has no effect.” The burden is on me to gather enough evidence to reject that null.
I find this incredibly helpful for belief formation. Instead of just looking for evidence that I’m right, I try to hold the null in my mind: what if I’m wrong? What would the world look like if this belief were false? What evidence would I expect to see? This stance shifts me from seeking confirmation to actively hunting for disconfirmation.
In practice, this doesn’t mean doubting everything all the time — it means giving my beliefs a chance to fail. If they survive contact with the null, they come out stronger and more justified. If they don’t, I’ve just saved myself from building on a bad foundation.
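That stance, asking how the world would look if the null were true, can be sketched as a small simulation. The coin-flip numbers below are invented for illustration:

```python
# Sketch of null-hypothesis thinking via simulation (numbers are invented).
# Belief: "this coin is biased toward heads." Null: "the coin is fair."
import random

random.seed(42)  # fixed seed so the sketch is reproducible

observed_heads = 64   # suppose we saw 64 heads in 100 flips
n_flips, n_sims = 100, 10_000

# How often would a FAIR coin do at least this well, just by luck?
extreme = sum(
    sum(random.random() < 0.5 for _ in range(n_flips)) >= observed_heads
    for _ in range(n_sims)
)
p_value = extreme / n_sims
print(p_value < 0.05)  # a tiny p-value makes the null hard to keep believing
```

The point is not the arithmetic but the posture: the simulation spends all its effort imagining the world where my belief is false, and only rejects that world when the evidence genuinely crowds it out.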