
When Thinking Fails: Field Guide

A practical reference for the moments when your own reasoning is the problem


How to Use This Guide

This guide is organized by the moments where your thinking is most likely to let you down, not by theory. When you catch yourself feeling certain, when a decision feels "obvious," when your team agrees too quickly, find the relevant section and slow down long enough to read through it.

Each section has a few principles, a short explanation, and a source if you want to go deeper. The goal isn't to prove you're right. It's to find out you're wrong before it gets expensive.


When You Might Open This Guide


When You're About To Act On A Conclusion That Feels Obvious

Confidence arrives before evidence does. You do not feel unsure first and then gather data until you become confident. Most of the time it runs the other way. You land on a conclusion fast, and then your brain goes looking for reasons to support it. The feeling of certainty is not proof that you're right. It's proof that your brain has finished working, which is a different thing entirely.

Source: Lecture 2 speaker notes (Jason Johnson)

Intelligence is not immunity. It's often the opposite. Smart people are really good at building coherent explanations for why they are right. That's the trap. The more tools you have for reasoning, the more convincing the story you can tell yourself. Being articulate about your position is not the same as being correct about it, and a well-constructed argument is exactly what motivated reasoning produces.

Source: Lecture 2 speaker notes (Jason Johnson); Pinker, Rationality

Ask how much confidence the thing actually deserves. The question isn't "do I have a reason to believe this." Of course you do, or you wouldn't believe it. The question is how much weight that reason can actually carry. If someone equally smart used the same reasoning and landed somewhere else, your confidence is resting on something other than the evidence.

Source: Lecture 2 speaker notes (Jason Johnson)


When You Need To Pressure-Test A Plan Before You Commit

Write down what would have to be true for this to be wrong. Popper's insight about science was that a good theory isn't one you can prove; it's one you could in principle disprove. The same applies to a business plan or a project bet. If nothing you could observe would change your mind, you don't have a plan, you have a belief. Decide in advance what counts as failure, and write it down somewhere a future version of you can find it.

Source: Philosopher's Toolkit (Baggini & Fosl)

Prefer cheap tests and reversible bets at the start. The cost of being wrong is not constant. It's small when you're still exploring and enormous once you've built infrastructure around a decision. So the early moves should be the ones you can throw away without grief. A two-day prototype that kills an idea is worth more than a two-month build that proves the same thing. This is especially true with AI, where plausible output shows up faster than verification does, and early assumptions get quietly encoded into systems that replicate them cleanly.

Source: Lecture 2 speaker notes (Jason Johnson)

When you're drawing conclusions from thin evidence, wiggle room is the enemy. If you go looking for a pattern, you will find one. The more ways you leave yourself to interpret the result, the more likely you are to find whatever you were hoping for. Lock down what you're measuring and what would count as a negative result before you run the test, not after.

Source: Ellenberg, How Not to Be Wrong


When Everyone In The Room Agrees Too Quickly

Unanimity is a warning sign, not a finish line. If a group converges fast and nobody has pushed back, the group has probably not actually examined the question yet. People agree for social reasons as often as for intellectual ones, and the price of dissent inside a team is usually higher than the price of being wrong together. Treat early agreement as a cue to slow down and ask what the strongest objection would sound like.

Source: Sagan, The Demon-Haunted World

You are biased about your biases. People consistently rate themselves as less susceptible to bias than everyone else in the room, and almost no one rates themselves as more biased than average. The person most likely to miss their own blind spot is the one who feels most confident they've accounted for it. The humility fix isn't to doubt yourself more. It's to invite a specific person to disagree with you on purpose.

Source: Pinker, Rationality

Build teams that share a goal, not an opinion. The most useful group is one that doesn't agree on everything but is pointed at the same target. That kind of team catches each other's mistakes. A team that already agrees on the answer can only confirm it.

Source: Pinker, Rationality


When You Notice Yourself Wanting A Particular Answer

If you want it to be true, cut your confidence in half. Motivated reasoning is the polite name for the thing where you reach a conclusion first and then find reasons. The tell is simple: you feel pulled toward the answer. Upton Sinclair's version is cleaner than any textbook definition: it is difficult to get someone to understand something when their salary depends on not understanding it. Your salary doesn't have to be money. It can be your identity, your plan, the last six weeks of work, the thing you already told your boss.

Source: Pinker, Rationality (quoting Upton Sinclair)

Reversing the question is a fast diagnostic. People are measurably better reasoners when they want something to be false than when they want it to be true. So flip it. Instead of "what makes this true," ask "what would have to be true for me to walk away from this." If you can't generate a real answer, you're not evaluating the idea. You're defending it.

Source: Pinker, Rationality

Most people do not fail because they lack intelligence. They fail because their confidence felt justified too early. That's the whole shape of it. The error was not in the analysis. The error was that the analysis ran on assumptions that nobody had yet stress-tested, and by the time the world pushed back, the commitment was already made. The earlier that happens, the more expensive the fix.

Source: Lecture 2 speaker notes (Jason Johnson)


When You're Evaluating A Claim From An Expert (Including Yourself)

Francis Bacon's line, four hundred years old and still the best warning label on reasoning: people more readily believe things they want to be true. Credentials don't inoculate you against this. Neither does experience. A good analyst with a track record of calling things right can still be confidently wrong about the next thing, because the reasons they were right before may not apply this time, and their confidence is being paid forward by a reputation that has nothing to do with the current problem.

Source: Sagan, The Demon-Haunted World (quoting Francis Bacon)

Ask what would change this person's mind, not what they believe. A belief without a disconfirming condition is a posture. If the expert in front of you, or the voice in your own head, can't describe the observation that would make them drop the claim, the claim isn't operating as analysis. It's operating as identity. Treat it accordingly.

Source: Philosopher's Toolkit (Baggini & Fosl)

Confidence is not commitment. You are allowed to act on an incomplete picture. You do it every day, and you have to. But you're not required to pretend your picture is complete. Confident enough to move and humble enough to update is a real position; it's just uncomfortable, because it means you have to keep paying attention after the decision is made.

Source: Lecture 2 speaker notes (Jason Johnson)


When You Have All The Information And Still Got It Wrong

Most failures come from over-interpreting partial signals, not from missing data. When something goes wrong, the first instinct is to ask what you didn't know. The more honest question is usually what you had in front of you and misread. Go back to the evidence that was actually on the table at the time, not the evidence you wish you'd had, and ask whether you built a clean story out of messy fragments. That move is how most confident bad decisions get made.

Source: Lecture 2, slide 16 speaker notes (Jason Johnson)

Clean stories are almost always built on messy, incomplete evidence. If the narrative you're telling yourself has no loose ends, no exceptions, and no uncomfortable counterexamples, you probably ironed them out without noticing you were doing it. A story that explains everything explains nothing. The more satisfying the plot feels, the more suspicious you should be of how you got there.

Source: Lecture 2, slide 16 speaker notes (Jason Johnson)

The better postmortem question is "did we not know, or did we misread?" After a bad call, teams usually blame missing information because that's the comfortable version of the story; it implies nobody could have known. If you actually had the signal and flattened it into the story you wanted, that is a different problem and it compounds. Do the harder version of the postmortem and you will catch yourself doing it earlier the next time.

Source: Lecture 2, slide 16 (Jason Johnson)


Five Questions To Run Before A Decision Feels "Done"

1. What is the main thing my confidence is resting on?
What it's testing: whether you have a reason or a feeling.

2. If someone equally smart used that same reason, could they reach a different conclusion?
What it's testing: whether the reason is load-bearing or decorative.

3. What would have to change for my confidence to drop by 10 points?
What it's testing: whether the belief is falsifiable at all.

4. Can I imagine a situation where this level of confidence would be wrong?
What it's testing: whether you've actually considered the downside.

5. If someone disagreed with me, what would I assume about their reasoning?
What it's testing: whether you're in analysis or in defense.

These are the prompts from the exercise at the start of Lecture 2. Keep them somewhere you'll see them the next time a decision feels obvious.


Go Deeper

These are the books this guide draws from, plus a few Jason has leaned on in class that don't get their own principle here.

Rationality by Steven Pinker: The clearest modern walkthrough of why smart people believe wrong things. Motivated reasoning, myside bias, and the cognitive illusions chapter are the core.

The Demon-Haunted World by Carl Sagan: The baloney detection kit is still the best short toolkit for evaluating claims, and the Francis Bacon line lives rent-free in Jason's head for a reason.

The Philosopher's Toolkit by Baggini & Fosl: If you only learn one new idea from this lecture, make it falsifiability. This book is where it's best explained in plain language.

How Not to Be Wrong by Jordan Ellenberg: The math-inflected companion to the other books here. The chapters on the null hypothesis, wiggle room, and the Baltimore stockbroker are the ones that apply most directly.

Factfulness by Hans Rosling: A reminder that the instincts we trust (fear, gap, size, urgency) are the ones that fail us on data questions. Reach for it any time you catch yourself reacting to a headline.

The Coddling of the American Mind by Lukianoff & Haidt: The cognitive distortions chapter is a field guide in its own right, and it names the distortions you are most likely to run into in group settings.

The Righteous Mind by Jonathan Haidt: The source of the elephant-and-rider model, Jason's go-to explanation for why the rational part of you is usually the press secretary, not the president.