We take the simple path because cognitive biases make costs invisible before we choose, and make outcomes feel predictable after we know them. This isn’t stupidity — it’s how human cognition works. Understanding the machinery helps us build systems that compensate.
Before the Decision: Why Simple Feels Right
| Bias | What It Does | How It Affects Choice |
|---|---|---|
| WYSIATI (What You See Is All There Is) | We decide based on available info, ignoring what we don’t know | Hidden costs aren’t visible, so we ignore them |
| System 1 vs System 2 | Fast/intuitive vs slow/deliberate thinking | Complex path requires System 2, which is effortful — we avoid it |
| Planning fallacy | We underestimate time, costs, and risks | “The renovation will go fine” / “We’ll fix it later” |
| Availability heuristic | We judge likelihood by how easily examples come to mind | Contractor disasters are invisible until they happen to YOU |
| Hyperbolic discounting | Future pain feels less real than present effort | Research feels hard NOW; the repair is abstract (see the formula below) |
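Hyperbolic discounting, the last row above, has a standard formal sketch (Mazur’s one-parameter model; the symbols are the usual ones, not something from this note’s sources): the present subjective value $V$ of a cost or reward of size $A$ that is $D$ units of time away is

$$V = \frac{A}{1 + kD}$$

where $k$ is an individual discount rate. Value drops steeply over short delays and flattens over long ones, which is why an hour of research tonight outweighs a repair bill that is months away.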
After the Consequences: Why We Judge Harshly
| Bias | What It Does | The Distortion |
|---|---|---|
| Hindsight bias | After knowing the outcome, we think we could have predicted it | “Obviously that contractor was bad” (but you didn’t know then) |
The textbook definition: “The tendency to believe falsely, after the outcome of an event is actually known, that one could have accurately predicted that outcome.”
The Reinforcing Loop
```
BEFORE THE DECISION:
├─ WYSIATI: Hidden costs aren't visible → we ignore them
├─ System 1: Complex path feels hard → we avoid it
├─ Planning fallacy: We underestimate what could go wrong
├─ Availability: Disasters aren't salient → we discount them
└─ Result: We take the simple path

AFTER THE CONSEQUENCES:
├─ Hindsight bias: "Obviously that was going to fail"
├─ We judge ourselves (or others) harshly
├─ We forget that the information wasn't available
└─ We learn the WRONG lesson: "I should have known"
   (instead of: "I should have prepared for uncertainty")
```
The Corrected Mental Model
| Intuitive Belief | Kahneman Correction |
|---|---|
| “I could have predicted that” | No: hindsight bias. You’re retrofitting a narrative. |
| “Next time I’ll know” | No: WYSIATI. Future unknowns will also be invisible. |
| “The careful path is too much work” | That’s System 1 avoiding System 2. The work exists either way. |
| “It probably won’t happen to me” | Availability heuristic. It happens to people exactly like you. |
The Right Lesson
Hindsight bias makes us think the future was knowable. It wasn’t. The lesson isn’t “predict better.” The lesson is “prepare for not knowing.”
This connects to Economic Data Tells You Where You Were, Not Where You Are: the same irreducible uncertainty applies as much to personal decisions as to economic cycles.
Common Trap
The trap: After something goes wrong, concluding “I should have seen it coming” and believing you’ll spot it next time.
The fix: You won’t. The next hidden cost will also be hidden. Build systems that assume uncertainty: size downsides, prepare for costs to come due, don’t bet on being the exception.
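One way to make “size downsides” concrete is Kahneman’s own remedy for the planning fallacy, the outside view: budget from how similar projects actually turned out, not from your inside estimate. Below is a minimal sketch of that idea; the function name, the percentile choice, and every number are hypothetical, not from any source.

```python
def buffered_budget(estimate: float,
                    past_overrun_ratios: list[float],
                    percentile: float = 0.9) -> float:
    """Scale a point estimate by a high percentile of historical
    actual/estimate cost ratios (the 'outside view')."""
    ratios = sorted(past_overrun_ratios)
    # Nearest-rank percentile: index into the sorted ratios.
    idx = min(len(ratios) - 1, int(percentile * len(ratios)))
    return estimate * ratios[idx]


# Hypothetical reference class: what similar renovations actually cost,
# as a multiple of their original estimate (all numbers invented).
history = [1.0, 1.1, 1.3, 1.4, 1.6, 2.0, 2.2, 2.5]

print(buffered_budget(10_000, history))  # 25000.0: budget the p90, not the estimate
```

The design point: the buffer comes from a reference class of outcomes, which is where hidden costs actually show up, rather than from your own prediction, which is exactly what WYSIATI corrupts.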
North: Where this comes from
- Thinking, Fast and Slow (Kahneman’s research)
- Bounded Rationality (Herbert Simon’s original concept)
East: What opposes this?
- Expertise Reduces Some Biases (domain experts have better calibration — sometimes)
South: Where this leads
- Simplicity Moves Cost, It Doesn’t Reduce It (the mechanical consequence)
- The People Around You Bear the Cost of Your Shortcuts (the ethical consequence)
West: What’s similar?
- Illusion of Competence (feeling like you know vs actually knowing)
- Confirmation Bias (another way we filter information)