Cognitive Biases

Cognitive biases are systematic patterns of deviation from rational judgment. They're mental shortcuts that usually help but sometimes mislead. Understanding them helps you think more clearly and diagnose problems more accurately.

Why Biases Matter for Diagnosis

When investigating problems:

  • You will be biased in how you gather and interpret evidence
  • Stakeholders will be biased in what they tell you
  • The organization will have embedded biases in its culture and processes

Awareness doesn't eliminate bias, but it reduces its impact.

Key Biases in Problem-Solving

Confirmation Bias

What it is: Seeking evidence that confirms what you already believe and ignoring evidence that contradicts it.

How it shows up:

  • Interviewing people you expect to agree with you first
  • Dismissing data that doesn't fit your theory
  • Asking leading questions that prompt expected answers

Countermeasure: Actively seek disconfirming evidence. Ask "what would prove me wrong?" and then look for it.


Availability Bias

What it is: Overweighting information that comes to mind easily, usually because it's recent, vivid, or emotionally charged.

How it shows up:

  • Focusing on the dramatic failure, not the mundane causes
  • Giving too much weight to the last thing someone told you
  • Thinking rare events are common because they're memorable

Countermeasure: Seek systematic data, not anecdotes. Ask "what am I not seeing because it's not memorable?"


Anchoring

What it is: Over-relying on the first piece of information encountered, even if it's arbitrary.

How it shows up:

  • Estimating based on numbers that were mentioned, even irrelevant ones
  • Starting investigations where someone else pointed, without questioning why
  • Getting stuck on the first explanation offered

Countermeasure: Generate your own estimates before looking at others. Consider multiple starting points.


Fundamental Attribution Error

What it is: Attributing others' behavior to their character rather than their circumstances, while excusing your own behavior as situational.

How it shows up:

  • "They made a mistake because they're careless" vs. "I made a mistake because I was rushed"
  • Blaming individuals when systems failed them
  • Ignoring context when judging performance

Countermeasure: Ask about circumstances before drawing conclusions about character. "What about the situation made this behavior likely?"


Hindsight Bias

What it is: Believing, after an event, that you would have predicted it. "I knew it all along."

How it shows up:

  • Judging past decisions by outcomes rather than the information available at the time
  • Underestimating how uncertain the situation was
  • Assuming failures were predictable and therefore someone's fault

Countermeasure: Reconstruct what was known at the time. Judge decisions by process, not just outcomes.


Sunk Cost Fallacy

What it is: Continuing to invest in something because of what you've already invested, rather than evaluating future value.

How it shows up:

  • "We've already spent $2M, we can't stop now"
  • Continuing failed initiatives because abandoning them feels like waste
  • Escalating commitment to failing strategies

Countermeasure: Evaluate decisions based on future costs and benefits only. Past investment is irrelevant to future value.
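
To make the decision rule concrete, here is a minimal sketch. All figures are invented for illustration; the point is that the sunk amount appears in the bookkeeping but never in the comparison:

```python
# Toy illustration of the decision rule: compare options by FUTURE
# costs and benefits only. All figures are invented for the example.
already_spent = 2_000_000  # sunk: identical under every option, so it never matters

# Option A: finish the project
cost_to_finish = 500_000
expected_value_if_finished = 300_000
net_future_finish = expected_value_if_finished - cost_to_finish  # -200,000

# Option B: stop now
net_future_stop = 0

decision = "stop" if net_future_stop > net_future_finish else "finish"
print(decision)  # "stop": the $2M already spent never enters the comparison
```

Note that `already_spent` is assigned but unused; that is the point. A decision procedure free of the sunk cost fallacy has no line in which past investment affects the outcome.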


Survivorship Bias

What it is: Drawing conclusions from successes without considering the failures that didn't survive to be observed.

How it shows up:

  • "These successful companies all did X, so X causes success"
  • Ignoring failed projects when analyzing what works
  • Learning only from visible examples

Countermeasure: Ask "what about the ones that didn't make it?" Seek out failure data.
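
A small simulation can make the trap visible. In this sketch (payoff numbers invented), a risky strategy has a lower average value than a safe one, yet if you "study" only the top performers, risk-taking looks like the secret of success:

```python
# Toy simulation of survivorship bias. Payoffs are invented:
# the risky strategy has a LOWER average value (0.5) than the
# safe one (1.0), yet it dominates the visible "winners".
import random

random.seed(1)
N = 10_000
outcomes = []  # (took_risk, realized_value)
for _ in range(N):
    risky = random.random() < 0.5
    if risky:
        value = 5.0 if random.random() < 0.1 else 0.0  # mostly wipes out
    else:
        value = 1.0                                     # always modest
    outcomes.append((risky, value))

# Study only the visible successes: the top 5% by value
winners = sorted(outcomes, key=lambda t: t[1], reverse=True)[: N // 20]
share_risky_among_winners = sum(r for r, _ in winners) / len(winners)

# The full population, failures included, tells the opposite story
avg_risky = sum(v for r, v in outcomes if r) / sum(1 for r, _ in outcomes if r)
avg_safe = sum(v for r, v in outcomes if not r) / sum(1 for r, _ in outcomes if not r)

print(share_risky_among_winners)  # close to 1.0: "all the winners took risks"
print(avg_risky, avg_safe)        # but risk-takers did worse on average
```

The wiped-out risk-takers are the planes that never came back: invisible in the winners' circle, decisive in the full data.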


Status Quo Bias

What it is: Preferring the current state of affairs over change, even when change would be beneficial.

How it shows up:

  • Resistance to new processes, tools, or structures
  • Framing change as risky and staying the same as safe
  • Requiring more evidence for change than for continuity

Countermeasure: Evaluate the current state as rigorously as you would a proposed change. What's the cost of not changing?


Dunning-Kruger Effect

What it is: People with low competence tend to overestimate their ability, while experts tend to underestimate theirs, assuming that what comes easily to them comes easily to everyone.

How it shows up:

  • Confident but wrong stakeholders
  • Experts who hedge too much
  • Resistance to outside perspective

Countermeasure: Calibrate confidence against track record. Seek feedback from people with different expertise levels.

Organizational Biases

Beyond individual biases, organizations develop collective biases:

Groupthink

Agreement driven by social pressure rather than evidence. Everyone nods, but private doubts go unvoiced.

NIH (Not Invented Here)

Dismissing external ideas because they weren't developed internally.

Success Theater

Reporting what's going well while hiding what's failing.

Tyranny of Metrics

Optimizing for measurable targets at the expense of unmeasured (but important) outcomes.

Using Bias Awareness

In Yourself

  • Before concluding, ask: "What bias might be affecting my judgment here?"
  • Seek evidence that would prove you wrong
  • Get outside perspectives

In Stakeholders

  • Recognize that what they tell you is filtered through their biases
  • Ask open, non-leading questions, and request specific examples rather than general impressions
  • Triangulate across multiple sources

In Organizations

  • Identify systemic biases in culture and process
  • Design countermeasures into processes
  • Create safe spaces for dissent

Bias is unavoidable. Unawareness of bias is not.