We take the simple path because cognitive biases make costs invisible before we choose, and make outcomes feel predictable after we know them. This isn’t stupidity — it’s how human cognition works. Understanding the machinery helps us build systems that compensate.


Before the Decision: Why Simple Feels Right

| Bias | What It Does | How It Affects Choice |
| --- | --- | --- |
| WYSIATI (What You See Is All There Is) | We decide based on available info, ignoring what we don't know | Hidden costs aren't visible, so we ignore them |
| System 1 vs. System 2 | Fast/intuitive vs. slow/deliberate thinking | The complex path requires System 2, which is effortful — we avoid it |
| Planning fallacy | We underestimate time, costs, and risks | "The renovation will go fine" / "We'll fix it later" |
| Availability heuristic | We judge likelihood by how easily examples come to mind | Contractor disasters are invisible until they happen to YOU |
| Hyperbolic discounting | Future pain feels less real than present effort | Research feels hard NOW; the repair is abstract |
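Hyperbolic discounting is the one bias above with a standard quantitative form: a delayed cost or reward of size A, felt D periods away, is perceived as roughly V = A / (1 + kD), where k measures impatience. A minimal sketch (the function name, dollar amounts, and k value are illustrative assumptions, not from the text) shows why a large future repair can feel smaller than a small present effort:

```python
def discounted_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Perceived present value of a cost/benefit `delay_days` in the future,
    using the standard hyperbolic form V = A / (1 + k * D)."""
    return amount / (1 + k * delay_days)

# A $5,000 repair two years out "feels like" far less than it is...
repair_felt = discounted_value(5000, delay_days=730)   # ≈ 133

# ...while a $300 weekend of research right now is felt at full weight.
research_felt = discounted_value(300, delay_days=0)    # 300
```

With these illustrative numbers, the distant repair registers as smaller than the immediate research cost — which is exactly why the simple path wins before the decision.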

After the Consequences: Why We Judge Harshly

| Bias | What It Does | The Distortion |
| --- | --- | --- |
| Hindsight bias | After knowing the outcome, we think we could have predicted it | "Obviously that contractor was bad" — but you didn't know then |

The textbook definition: “The tendency to believe falsely, after the outcome of an event is actually known, that one could have accurately predicted that outcome.”


The Reinforcing Loop

BEFORE THE DECISION:
├─ WYSIATI: Hidden costs aren't visible → we ignore them
├─ System 1: Complex path feels hard → we avoid it
├─ Planning fallacy: We underestimate what could go wrong
├─ Availability: Disasters aren't salient → we discount them
└─ Result: We take the simple path

AFTER THE CONSEQUENCES:
├─ Hindsight bias: "Obviously that was going to fail"
├─ We judge ourselves (or others) harshly
├─ We forget that the information wasn't available
└─ We learn the WRONG lesson: "I should have known"
   (instead of: "I should have prepared for uncertainty")

The Corrected Mental Model

| Intuitive Belief | Kahneman Correction |
| --- | --- |
| "I could have predicted that" | No — hindsight bias. You're retrofitting a narrative. |
| "Next time I'll know" | No — WYSIATI. Future unknowns will also be invisible. |
| "The careful path is too much work" | That's System 1 avoiding System 2. The work exists either way. |
| "It probably won't happen to me" | Availability heuristic. It happens to people exactly like you. |

The Right Lesson

Hindsight bias makes us think the future was knowable. It wasn’t. The lesson isn’t “predict better.” The lesson is “prepare for not knowing.”

This connects to Economic Data Tells You Where You Were, Not Where You Are — the same irreducible uncertainty applies to personal decisions as to economic cycles.


Common Trap

The trap: After something goes wrong, concluding “I should have seen it coming” and believing you’ll spot it next time.

The fix: You won’t. The next hidden cost will also be hidden. Build systems that assume uncertainty: size downsides, prepare for costs to come due, don’t bet on being the exception.


North: Where this comes from

East: What opposes this?

South: Where this leads

West: What’s similar?