Problems: Field Guide
A practical reference for finding problems actually worth solving
How to Use This Guide
This guide is organized by situations you'll actually hit while you're picking a problem to chase. When you're staring at a list of ideas, when a problem sounds good but you can't tell if it's real, when you're six weeks in and quietly panicking that you picked wrong, find the relevant section and work the heuristics before you keep moving.
Every heuristic here was built from two things: the lectures we'll walk through together, and the hard lessons of people who've picked problems and lived with the consequences. None of these are rules. They're tests you can run on your own thinking.
When You Might Open This Guide
- You have three problem ideas from homework and you have to pick one in 48 hours.
- A problem sounds amazing on paper but nobody you've talked to seems that worked up about it.
- You're drawn to a problem mostly because it sounds prestigious, or because an investor would like it.
- You're staring at a symptom (customers keep complaining about X) and you can't tell what the upstream cause is.
- You're about to do your first customer interview and you're worried you'll lead the witness.
- You're six weeks into a project and you're starting to suspect you picked the wrong problem but you don't want to throw the work away.
- Your team is about to vote on which idea to run with and someone keeps defaulting to "let's just build it and see."
When You Have a List of Ideas and Don't Know Which One to Chase
Most projects succeed or fail before you build anything. The expensive phase is the build. The cheap phase is the thinking. If you pick the wrong customer or the wrong pain, you can build the "right solution" to the wrong problem and still fail. Everything in this lecture is about upstream decisions, because that's where the leverage actually lives.
Source: Jason, Lecture 4 stream of consciousness
A high-quality problem has five things, not one. A real customer with real pain (not "people in general"), evidence of demand you can point at (workarounds, time, money already being spent), a clear success signal you can name out loud, a feasible first test you could run in one to three days, and a wedge you can actually win with. If your idea is missing two of these, it's not ready yet.
Source: Jason, Lecture 4 Prep Review
Write it in the forcing-function format or don't call it a problem. "[Customer] struggles to [job-to-be-done] because [constraint], resulting in [impact]." Students default to pitching solutions dressed up as problems. This sentence form makes that impossible. If you can't fill in every bracket with something specific, the problem isn't clear enough to commit to. For example: "Riders at the UVU stop struggle to decide when to leave the house because they can't see where the bus actually is, resulting in missed buses and wasted mornings."
Source: Jason, Lecture 4 Prep Review
Pick the one you can actually test with a human this week. When you're torn between ideas, pick the one that feels most testable with real humans in the real world. Testability beats elegance. The problem that survives contact with an actual person, this week, at a coffee shop, is worth more than the problem that sounds beautiful in a pitch deck.
Source: Jason, Lecture 4 Prep Review
Go where your advantages actually mean something. Take inventory of what you can do that few people would match. The intersections of disciplines and expertise are the most valuable areas to mine. You don't need to be the best in the world at one thing; you need to be one of the few who sit at the intersection of two or three things that matter for this customer.
Source: Jason, Lecture 4 stream of consciousness
When a Problem Sounds Good but You Can't Tell if It's Real
If there are no stakes, it's not a high-quality problem yet. Pain without stakes is preference. Ask what happens to this customer if the problem never gets solved. Do they lose time, money, reputation, sleep, or standing? If the honest answer is "they'd shrug," you're looking at a nice-to-have, not a problem worth your semester.
Source: Jason, Lecture 4 Prep Review
Look for workarounds before you look for enthusiasm. People will say lots of encouraging things in an interview; they won't bother building workarounds unless the pain is real. Spreadsheets people maintain by hand, group chats full of screenshots, calendar reminders set to nag themselves. Workarounds are evidence. Enthusiasm is politeness.
Source: Jason, Lecture 4 Prep Review
Run the premortem: it's two months from now and we failed. Why? Sit down and assume the project died. Walk backwards from the failure. What warning signs did we ignore? What did we choose not to look at? This flushes out the assumptions you were protecting. A premortem costs nothing and catches the one thing you didn't want to notice.
Source: Jason, Lecture 4 stream of consciousness
Name what would prove you wrong, out loud, in one sentence. Your founding hypothesis has five parts, and the last one is the one students skip: we're wrong if ___. If you can't finish that sentence with something concrete and observable, you don't have a hypothesis. You have a wish. Circle the "wrong if" line on your poster board; it matters more than the rest combined.
Source: Jason, Lecture 4 Prep Review
Falsification is what separates science from pseudoscience. A belief that can't be wrong isn't a belief, it's a feeling. The same rule applies to problem statements. If no customer interview, no landing page test, and no pilot could ever disconfirm your problem, you're not holding a hypothesis; you're holding a hope.
Source: Ozan Varol, Think Like a Rocket Scientist ("Falsification is what separates science from pseudoscience," Ch. 6)
When You're Drawn to a Problem Because It Sounds Prestigious
Cargo cults are the warning, not the template. Copying the rituals of successful teams without understanding the purpose is how smart people waste entire semesters. If you're picking this problem because it sounds like what a startup would pick, or because the framework you just read picked something similar, stop. The process only works when you know why each step exists.
Source: Jason, Lecture 4 stream of consciousness
Watch out when the problem flatters you more than it serves a customer. Prestige problems have a tell: when you describe them, you spend more time on how hard and impressive the solution would be than on who is in pain. Flip the ratio. If you can't spend three minutes describing the customer's bad day without getting bored, the problem isn't yours yet.
Source: Jason, Lecture 4 stream of consciousness
Differentiation must be meaningful, not cosmetic. "We have a better UI" is not differentiation. Differentiation is solving it in a way that changes the tradeoffs the customer is living with. If your advantage disappears when a competitor spends a weekend on it, it was never an advantage. It was decoration.
Source: Jason, Lecture 4 Prep Review
Have the courage to move in the opposite direction. First-principles thinking starts by distrusting the received wisdom about what's interesting, what's hot, what's fundable. Break the problem down to what you actually know, then ask what the obvious answer is hiding. The prestigious problem is usually the one everyone else is already chasing poorly.
Source: Ozan Varol, Think Like a Rocket Scientist (notes on Ch. 2, "Reasoning from First Principles")
When You're Staring at a Symptom and Can't Find the Real Problem
The system you think you're in is almost never the system you're actually in. Most surprises aren't failures of effort. They're failures of the mental model you brought in. If the customer keeps doing something "irrational," your model of their world is the thing that's broken, not the customer. Update the model before you update the plan.
Source: Jason, Lecture 4 Red Teaming
Bottlenecks shape behavior more than effort does. If the customer is pushing on a constraint you can't see, working harder won't move the outcome. Find the bottleneck first. Ask what would have to be true for this pain to disappear entirely, and work backward from there. That's usually upstream of whatever they're complaining about.
Source: Jason, Lecture 4 Red Teaming
Leverage rarely looks like "work harder." Real leverage looks like changing a constraint, not speeding up a flow. If your proposed solution is "the same thing but faster," you probably haven't found the real problem. When you do find it, the solution often feels almost embarrassingly simple.
Source: Jason, Lecture 4 Red Teaming
The Route Tracker lesson: the real job was never what it looked like. I thought I was building a bus tracker. The actual job was "let me check quickly whether I need to leave the house right now." Customers will describe features. Your job is to hear the job-to-be-done underneath. "Faster horses" happens when you ask the wrong question and then faithfully answer it.
Source: Jason, Lecture 4 stream of consciousness
Stop chasing the first cause you see; look one layer down. The complaint is rarely the problem. The complaint is what the customer has language for. Ask what has to be true for the complaint to happen in the first place. Then ask again. The real problem is usually two layers down from the surface, and it's usually something the customer hasn't named yet.
Source: Jason, Lecture 4 stream of consciousness
When You're About to Talk to a Customer and Don't Want to Bias Them
Interview the pain, not the solution. You are not there to pitch. You are there to find out whether the pain is real, how often it hits, who it hits, and what they already do about it. If you leave the conversation and all you can remember is whether they liked your idea, you ran the wrong interview.
Source: Jason, Lecture 4 Prep Review
Go find them where the pain actually happens. The Route Tracker got real when I stopped polling friends and started standing at the UVU bus stop asking people getting off the bus what they'd pay. Pain is specific, and so are the places it happens. Don't interview customers in the abstract; interview them in context.
Source: Jason, Lecture 4 stream of consciousness
Ask about the last time it happened, not whether it happens. "Would you use a tool like this?" gets you lies. "Walk me through the last time this was a problem for you" gets you the truth. Past behavior is data; future intention is a vibe.
Source: Jason, Lecture 4 Prep Review
Cheap tests beat expensive opinions. You don't need twenty interviews. You need three good conversations, a rough artifact, and the nerve to watch someone use it without defending it. Concierge versions, landing pages, and manual pilots are not lesser forms of testing; they're the forms that let you be wrong cheaply.
Source: Jason, Lecture 4 Prep Review
If your first test requires a full build, you waited too long. The whole point of small loops is to keep failure recoverable. If the only way to learn whether your problem is real is to ship a finished product, you've skipped the part of the process that was supposed to protect you.
Source: Jason, Lecture 4 Prep Review
When the Problem Seems Too Big to Start
Pick the wedge, not the whole problem. A high-quality problem includes a narrow entry point you can actually win. You're not solving "public transit is hard in Utah County"; you're solving "riders at this one stop never know if the bus is coming." The wedge is how you get a real customer this week instead of a hypothetical customer next semester.
Source: Jason, Lecture 4 Prep Review
Always have something shippable. Incremental value, always a version that can go out the door, always feedback coming in. This doesn't mean ship fast and break things. It means never let yourself get to a place where the only way to learn anything is to spend another month.
Source: Jason, Lecture 4 stream of consciousness
Build the founding hypothesis, then iterate on it in tiny loops. The founding hypothesis is the opening bet. It's not the final answer; it's the first thing concrete enough to test. Once you have it, stop theorizing and start running small experiments on each part: customer, pain, alternative, differentiation, signal. Small loops until it clicks.
Source: Jason, Lecture 4 stream of consciousness
Only when you recognize the fragility in the system can you see what has to be protected. You don't need to solve everything. You need to find the one thing that, if it broke, would take the whole experience down. Start there. Most projects die from ignoring a fragility someone could have seen on day one.
Source: Ozan Varol, Think Like a Rocket Scientist (notes on Ch. 8)
When You Suspect You Picked the Wrong Problem
Speed without truth is just confident wrongness. AI will let you move faster than ever. Speed creates confidence. Confidence runs ahead of reality. That's why your testing discipline matters more with AI, not less. If you're moving fast and not learning anything disconfirming, you're not iterating; you're hallucinating.
Source: Jason, Lecture 4 Prep Review
Surprises are information, not failure. When the customer behaves in a way you didn't predict, that's not the project going wrong. That's the project telling you your model was incomplete. Treat every surprise as a free upgrade to your understanding of the system.
Source: Jason, Lecture 4 Red Teaming
Seek alternatives to your first idea; consider conflicting opinions before you commit. Lincoln built a team of rivals on purpose. You need the same thing in miniature. Before you commit to an approach, force yourself to write out the two strongest alternative framings and the two sharpest objections. If you can't steelman the competition, you're not ready to pick yet.
Source: Jason, Lecture 4 stream of consciousness
Pivoting is cheap when the loops were cheap. If you built small and tested early, pivoting six weeks in is a Tuesday. If you built big and tested nothing, pivoting six weeks in is a crisis. The question isn't whether to pivot; the question is whether your earlier self left you enough optionality to do it gracefully.
Source: Jason, Lecture 4 Prep Review
The Founding Hypothesis Template
Every problem you chase should collapse into a single testable sentence. Fill this in, circle the last line, and then go find out.
We believe [customer]
has the problem [pain / job-to-be-done],
and they will adopt our solution over their current alternative because [differentiation].
We'll know we're right if [measurable signal],
and we're wrong if [disconfirming evidence].
If any line is vague, the hypothesis is vague. If you can't finish the last line, you don't have a hypothesis yet.
Source: Jason, Lecture 4 Prep Review
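If you want to make the "no vague lines" rule mechanical, the template collapses naturally into a small checklist script. This is an illustrative sketch, not part of the course materials: the field names mirror the template's bracketed slots, and the sample hypothesis reuses the Route Tracker story from earlier in the guide.

```python
from dataclasses import dataclass, fields

@dataclass
class FoundingHypothesis:
    customer: str         # We believe [customer]
    pain: str             # has the problem [pain / job-to-be-done]
    differentiation: str  # why they'll adopt it over their current alternative
    right_if: str         # We'll know we're right if [measurable signal]
    wrong_if: str         # and we're wrong if [disconfirming evidence]

    def vague_lines(self) -> list[str]:
        """Return the slots that are still empty or hand-wavy."""
        red_flags = {"", "tbd", "somehow", "users", "people", "better"}
        return [f.name for f in fields(self)
                if getattr(self, f.name).strip().lower() in red_flags]

h = FoundingHypothesis(
    customer="riders at the UVU bus stop",
    pain="can't tell whether they need to leave the house right now",
    differentiation="a one-glance answer instead of a full transit app",
    right_if="stop regulars check it twice a week without being prompted",
    wrong_if="",  # the line students skip
)
print(h.vague_lines())  # -> ['wrong_if']
```

If `vague_lines()` comes back non-empty, you have a wish, not a hypothesis, and the offending slot tells you exactly which conversation to go have next.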
The Five-Part Test for a High-Quality Problem
| Test | What to look for |
|---|---|
| Real customer with real pain | Someone specific, in a specific context, with stakes you can name |
| Evidence of demand | Workarounds, time spent, money already moving, complaints in the wild |
| Clear success signal | "We'll know it's better if ___" in one sentence |
| Feasible first test | Something you could run in 1 to 3 days, without writing production code |
| A wedge | A narrow entry point you can actually win, not the whole category |
If your problem is missing two or more, put it back and keep looking.
Source: Jason, Lecture 4 Prep Review
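The table above reads directly as a scorecard. A minimal sketch, assuming only the five tests and the "missing two or more" cutoff stated above; the function and set names are illustrative:

```python
# The five tests, taken from the table above.
TESTS = [
    "real customer with real pain",
    "evidence of demand",
    "clear success signal",
    "feasible first test",
    "a wedge",
]

def verdict(passed: set[str]) -> str:
    """Apply the cutoff: missing two or more means put it back."""
    missing = [t for t in TESTS if t not in passed]
    return "keep looking" if len(missing) >= 2 else "worth testing"

# Passing three of five tests still fails the cutoff.
print(verdict({"evidence of demand", "feasible first test", "a wedge"}))  # -> keep looking
```

Missing exactly one test leaves the problem in play; that single gap is your first assignment, not a reason to discard the idea.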
Go Deeper
These are the books behind the thinking in this guide. Read the ones that match where you're stuck.
| Book | Why It Matters |
|---|---|
| Think Like a Rocket Scientist by Ozan Varol | First-principles thinking, falsifiability, and the courage to move in the opposite direction when everyone else is pushing the same way. |
| Click by Jake Knapp & John Zeratsky | The founding-hypothesis idea and the foundation-sprint process. This is the book we're going to run against during the deep dives. |
| Sprint by Jake Knapp | Fast structured experiments that keep failure cheap. How to learn more in five days than most teams learn in five months. |
| Hidden Potential by Adam Grant | Practice, scaffolding, and how learning actually works when you can't shortcut your way through. Useful when a problem is hard and you're tempted to quit. |
| Good Economics for Hard Times by Banerjee & Duflo | How smart people get problems wrong at scale. Useful calibration for "is this actually what I think it is?" |
| Poor Economics by Banerjee & Duflo | A case study in why the obvious cause of a problem is usually not the real one, and why interventions built on the wrong model fail in predictable ways. |
| Thinking in Systems by Donella Meadows | Stocks, flows, feedback loops, and leverage points. The mental model behind "the system you think you're in is almost never the system you're actually in." |
| How Not to Be Wrong by Jordan Ellenberg | Base-rate reasoning for sizing up a problem honestly before you commit. Math that keeps you from confusing noise with a real signal. |
One Last Thing
"Your project will succeed or fail mostly based on decisions you make before you build: choosing the right customer, the right problem, and testing your assumptions early enough that failure is cheap."
That's the whole lecture in one sentence. If you remember nothing else from this guide, remember that the expensive part of the work is downstream of the decisions you're making right now, and the cheapest thing you can do today is run one more test on the problem itself.