Most people are probably familiar with Cognitive Biases, the things that can wreak havoc on people, relationships, teams, organisations – really anything where people are involved. If you’re not, you may want to read Cognitive Science: An Introduction/Biases and Reasoning Heuristics. To summarise and refresh your memory, here’s the list from that article*:
- Framing: Viewing a need in the real world as a “problem” you can work on solving; mistaking your view of the problem for the real need
- Anchoring & Adjustment: Assuming a starting point and thinking about adjustments from there
- Status Quo: “Business as Usual”, “If it ain’t broke, don’t fix it”
- Sunk Cost: Treating the resources already spent on one alternative as an estimate of the resources you’ll have to spend all over again to start a new one
- Confirmation: If you’re leaning towards an action, see if you can prove it’s a good one
- Cognitive Overconfidence: Decisiveness & Refusal to be haunted by doubt
- Prudent Estimation: “Conservative Estimates”
- Risk Aversion: “A bird in the hand is worth two in the bush”. Avoid probability of ruin
- Selective Perception: Knowing what you’re looking for
- Recallability (“availability”): If an idea doesn’t fit in with the obvious data, it’s surely suspect
- Guessing at Patterns: Quickly spotting the trend or the big picture
- Representativeness: “If it looks like a duck and walks like a duck and quacks like a duck”
- Most likely Scenario: Avoids wasting time on possibilities that probably won’t happen
- Optimism: Go for the gold!
- Pessimism: Avoid unpleasant surprises
It’s pretty easy to see how we all, personally and organisationally, can fall into these traps to varying degrees and frequencies. I’d add one more, which I think is often a risk in software:
- Least likely Scenario: Obsesses over a scenario which is highly unlikely and probably won’t matter anyway
This last one is tricky. Part of our job in IT is to explore the realm of exceptions, since anyone can design a system that works for “the happy path”. The real skill is designing a system that is robust and can withstand unusual paths. This is not the same as exploring every scenario, no matter how unlikely: a robust system should be able to handle error paths that no-one has even considered. To me, that’s where real Architecture and Design come in, but that’s a whole other post.
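To make that distinction concrete, here’s a minimal sketch (in Python, with hypothetical names) of the difference between handling the error paths you anticipated and building in a catch-all safety net for the ones nobody considered:

```python
import logging

def parse_amount(raw):
    """Parse a monetary amount from user input.

    Anticipated error paths (missing value, non-numeric text) are
    handled explicitly and return None.
    """
    if raw is None or raw.strip() == "":
        return None  # anticipated: missing value
    try:
        return float(raw.replace(",", ""))
    except ValueError:
        return None  # anticipated: not a number

def process_batch(raw_values):
    """Process a batch robustly: a single surprising input is logged
    and skipped rather than taking the whole run down."""
    results = []
    for raw in raw_values:
        try:
            amount = parse_amount(raw)
        except Exception:  # catch-all: the path no-one considered
            logging.exception("unexpected failure on %r", raw)
            amount = None
        if amount is not None:
            results.append(amount)
    return results
```

For example, feeding the batch a raw integer (a case `parse_amount` was never written for) trips an `AttributeError` on `strip`, but the batch survives: `process_batch(["1,000", "", "abc", 42])` drops the surprise and returns `[1000.0]`. The point is not the catch-all itself but deciding *where* one failure should be contained, rather than enumerating every unlikely scenario up front.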
Back to the Skramjet context, I think it’s obvious that you need to be aware of (and hopefully correct) your own cognitive biases, but I’d add that the team also needs to be aware of its own and the organisation’s cognitive biases. As people, with sufficient motivation we can change quite quickly; for organisations this is more difficult because of the “momentum” and the “Status Quo” bias that seem to exist in most organisations, even the more “progressive” ones. This is not an easy path to tread, and to walk it we need some heroes, so dust off your cape for the next post ;-)
* With a bit of googling you can find even more comprehensive lists! Personally, I think these are enough for most activities.