The Psychology of Decision-Making: Why We Make Irrational Choices
Humans are not rational actors who occasionally err. We are systematically, predictably irrational in ways that researchers have mapped in detail. Understanding these patterns is the beginning of making better decisions.
Why do humans make irrational decisions?
The short answer: because the mental shortcuts that allowed our ancestors to make fast decisions in uncertain environments are systematically misapplied in modern contexts they were not designed for.
Kahneman and Tversky's prospect theory (1979) demonstrated that human decision-making departs from rational choice models in predictable, repeatable ways. Their work, and the behavioural economics it spawned, has documented over 180 distinct cognitive biases.
This is not a list of human flaws. It is a map of how the mind actually works — and the beginning of using that map to make better decisions.
What is loss aversion?
Loss aversion is the empirical finding that losses loom approximately twice as large psychologically as equivalent gains. Losing £100 produces roughly twice the psychological pain as gaining £100 produces pleasure.
Practical consequences:
- Investors hold losing positions too long (avoiding the pain of realising the loss) and sell winning positions too soon (cashing in before the gain disappears)
- People work harder to avoid losing something they have than to gain something of equivalent value they do not have
- Framing effects are powerful: "save £200" and "avoid paying £200 extra" are mathematically identical; the latter reliably produces higher compliance
Loss aversion is not irrational in evolutionary terms. In a resource-scarce environment, losses are genuinely more dangerous than equivalent gains are beneficial: no run of later wins can compensate for a single lethal loss.
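The asymmetry has a standard formalisation in prospect theory's value function. A minimal sketch, using the median parameter estimates from Tversky and Kahneman's 1992 follow-up study (curvature alpha of about 0.88, loss-aversion coefficient lambda of about 2.25):

```python
# Prospect theory's value function, a minimal sketch.
# alpha and lambda_ are the median estimates reported by
# Tversky and Kahneman (1992): alpha ~ 0.88, lambda ~ 2.25.

def value(x, alpha=0.88, lambda_=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lambda_ * ((-x) ** alpha)

gain = value(100)    # felt value of gaining £100
loss = value(-100)   # felt value of losing £100
print(abs(loss) / gain)  # ratio of pain to pleasure: lambda, ~2.25
```

For equal-magnitude outcomes the pain-to-pleasure ratio is exactly lambda, which is where the "losses loom roughly twice as large" summary comes from.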
What is the availability heuristic?
The availability heuristic is the mental shortcut of judging the probability of an event by how easily examples come to mind.
People consistently overestimate the probability of dramatic, memorable causes of death (plane crashes, shark attacks, terrorism) and underestimate mundane ones (diabetes, heart disease, car accidents) because the dramatic events receive disproportionate media coverage and generate vivid mental images that are easy to recall.
This has significant implications for:
- Risk assessment: Security spending often correlates with availability (the last salient attack) rather than actual risk
- Hiring: Interviewers weight vivid anecdotes from candidates over systematic assessment data
- Medical diagnosis: Doctors anchor on recently seen cases; a dermatologist who diagnosed several melanomas last week is more likely to read the next ambiguous lesion as melanoma
Corrective: make probability estimates before seeking examples. Anchoring to a prior estimate before searching memory reduces availability bias.
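The corrective amounts to consulting base rates before consulting memory. A toy sketch; the figures are rough, order-of-magnitude approximations of annual US deaths, included purely for illustration:

```python
# Base-rate check, a toy sketch: rank risks by recorded frequency
# rather than by how easily examples come to mind. Figures are
# rough, illustrative approximations of annual US deaths.

approx_annual_us_deaths = {
    "heart disease": 700_000,
    "diabetes": 100_000,
    "car accidents": 40_000,
    "plane crashes": 400,   # all aviation, not only airlines
    "shark attacks": 1,
}

# The dramatic causes dominate recall; the base rates tell
# the opposite story.
ranked = sorted(approx_annual_us_deaths,
                key=approx_annual_us_deaths.get, reverse=True)
print(ranked[0], ranked[-1])  # most frequent vs least frequent
```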
What is the sunk cost fallacy?
The sunk cost fallacy is the tendency to continue an investment of time, money, or effort based on what has already been spent rather than on the expected future value.
Rational decision-making looks only forward: the relevant question is whether continued investment generates positive expected returns from this point. Past expenditure is irretrievable — a sunk cost — and should not influence the decision.
But humans reliably weight past investment. We finish books we are not enjoying because we have read half of them. We stay in relationships or jobs or projects that are clearly net negative because of what we have already invested.
The corrective is a simple reframe: "If I were starting today with no prior investment, would I choose this?" If the answer is no, the sunk cost is distorting the decision.
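The reframe can be made literal in code. A minimal sketch (the function and argument names are illustrative): sunk cost is accepted as an argument only to demonstrate that a forward-looking decision never uses it.

```python
# Forward-looking decision rule, a minimal sketch: only future
# costs and expected future value enter the decision. sunk_cost
# is a parameter purely to show it is deliberately ignored.

def should_continue(expected_future_value, remaining_cost, sunk_cost=0):
    """Continue only if the road ahead is worth it on its own."""
    del sunk_cost  # irretrievable, so it must not influence the choice
    return expected_future_value > remaining_cost

# Same project, wildly different sunk costs, same decision.
print(should_continue(5_000, 8_000, sunk_cost=100))     # False
print(should_continue(5_000, 8_000, sunk_cost=90_000))  # False
```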
What is choice overload, and how does it affect decisions?
Choice overload (the paradox of choice, as Barry Schwartz framed it) is the finding that beyond a certain number of options, adding more choices decreases both satisfaction with the eventual choice and the probability of making a choice at all.
The classic study (Iyengar and Lepper, 2000): a jam-tasting display offering 24 varieties attracted more visitors than a display offering 6 (roughly 60% versus 40% of passersby stopped), but shoppers at the 6-variety display bought at roughly ten times the rate (about 30% versus 3% of those who stopped).
Choice overload mechanisms:
- Decision fatigue: weighing more options depletes cognitive resources
- Opportunity cost anxiety: more options mean more imagined alternative futures foregone
- Post-choice regret: more options make it easier to imagine a better choice was available
Practical implication for product and UX design: fewer options with clearer differentiation consistently outperform large option sets.
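The jam-study arithmetic is worth working through. A quick sketch using roughly the rates the study reports (about 60% versus 40% of passersby stopped; about 3% versus 30% of those who stopped bought):

```python
# Per-100-passersby arithmetic for the jam study, a quick sketch
# using approximate rates from Iyengar and Lepper (2000).

def buyers_per_100(stop_rate, purchase_rate):
    """Expected buyers per 100 passersby."""
    return 100 * stop_rate * purchase_rate

large = buyers_per_100(0.60, 0.03)  # 24 varieties: more traffic
small = buyers_per_100(0.40, 0.30)  # 6 varieties: far more sales
print(large, small)
```

More options won the attention contest and lost the sales contest by a wide margin.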
What is anchoring bias?
Anchoring is the disproportionate influence of the first piece of numerical information encountered on subsequent estimates.
In salary negotiation: the first number stated in the negotiation is the anchor — even if it is arbitrary. The final number is predictably pulled toward the initial anchor.
In pricing: a "was £199, now £99" framing makes £99 feel inexpensive relative to the anchor, regardless of the actual value delivered.
In legal judgments: in one well-known study (Englich, Mussweiler and Strack, 2006), judges who rolled dice before deciding a sentencing demand produced demands that tracked the roll; the random number anchored the judgment.
Corrective: before receiving any numerical information, make your own independent estimate. Once your estimate is formed, anchoring has significantly less influence.
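One way to see why the corrective works is a toy linear model of anchoring (entirely an assumption for illustration: the reported estimate is pulled toward the anchor with some weight, and forming an independent estimate first shrinks that weight):

```python
# Toy linear model of anchoring, an illustrative assumption only:
# the reported estimate is a weighted blend of one's own belief
# and the anchor; a pre-formed independent estimate lowers w.

def reported_estimate(true_belief, anchor, w):
    """Estimate pulled toward the anchor with weight w in [0, 1]."""
    return (1 - w) * true_belief + w * anchor

fair_salary, opening_offer = 60_000, 45_000
no_prior   = reported_estimate(fair_salary, opening_offer, w=0.5)
with_prior = reported_estimate(fair_salary, opening_offer, w=0.15)
print(no_prior, with_prior)  # the pre-formed estimate resists the pull
```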
What is confirmation bias?
Confirmation bias is the tendency to seek, interpret, and remember information that confirms existing beliefs, while discounting information that contradicts them.
It operates at three levels:
- Information search: we search for confirming evidence more readily than disconfirming
- Interpretation: ambiguous information is interpreted as confirming
- Memory: confirming evidence is remembered more accurately
Confirmation bias is perhaps the most pervasive cognitive bias because it operates largely unconsciously and is reinforced by social environments: the people who share our views agree with us, and their agreement feels like evidence that the views are correct.
Correctives: actively seek the strongest version of the opposing argument (steel-manning, not straw-manning); consult people whose judgment you respect who hold different views; ask "what would change my mind?" before forming a view.
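The bias can be modelled as asymmetric Bayesian updating. A toy sketch (the discount factor is an assumption for illustration): a biased reasoner shrinks the evidential weight of disconfirming data before applying Bayes' rule, so contrary evidence barely moves the belief.

```python
# Confirmation bias as asymmetric updating, a toy sketch.

def update(prior, likelihood_ratio):
    """Bayes' rule on the odds scale."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio
    return odds / (1 + odds)

def biased_update(prior, likelihood_ratio, discount=0.25):
    """Disconfirming evidence (LR < 1) is shrunk toward LR = 1."""
    if likelihood_ratio < 1:
        likelihood_ratio = 1 - (1 - likelihood_ratio) * discount
    return update(prior, likelihood_ratio)

# Same disconfirming evidence, very different posteriors.
print(update(0.8, 0.2))         # honest update moves the belief down
print(biased_update(0.8, 0.2))  # biased update barely moves it
```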
How do you make systematically better decisions?
No technique eliminates bias. The goal is to reduce its influence at the margins — which compounds over a career.
The practical toolkit:
- Pre-mortems: before a major decision, assume it fails and work backward. This activates the consideration of disconfirming evidence before commitment.
- Reference class forecasting: before estimating, find the base rate for similar decisions by others. Human beings are systematically overoptimistic about their own projects.
- Decision journaling: document the reasoning behind significant decisions at the time they are made. Review periodically. Pattern recognition over time is more valuable than any single corrective.
- 10/10/10: how will you feel about this decision in 10 minutes? 10 months? 10 years? Time-shifting perspective reduces the weight of present emotional states.
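Reference class forecasting, in particular, reduces to a simple blend. A minimal sketch (the blending weight is an assumption; the technique's core is anchoring on the reference class's track record before adjusting for the inside view):

```python
# Reference class forecasting, a minimal sketch. The weight on the
# base rate is an illustrative assumption, not a prescribed value.

def reference_class_forecast(inside_view, base_rate, weight_on_base=0.7):
    """Blend one's own estimate with the reference class's base rate."""
    return weight_on_base * base_rate + (1 - weight_on_base) * inside_view

# You estimate your project takes 4 months; similar projects took 9.
print(reference_class_forecast(inside_view=4, base_rate=9))
```

The point of the weighting is directional: systematic overoptimism means the blended forecast should sit much closer to the base rate than to the inside view.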