Irrationality: the Enemy Within


Notes on Stuart Sutherland (1994), Irrationality: the Enemy Within, Penguin paperback (ISBN 0140167269); reissued in 2007 as Irrationality by Pinter and Martin (ISBN 978-1-905177-07-3).

 


 

In terms of making a great breadth of research easy to understand, with a variety of examples and applications, this is still the best non-technical introduction to cognitive biases. There are books covering newer research (such as Cordelia Fine's A Mind of Its Own), but none is as ambitious and accessible as this.

 

It looks at judgement and decision-making in many contexts, from military tactics to medicine and from job interviews to the paranormal. It relies heavily on scientific research (there are many elegant descriptions of psychological experiments), yet the technical details are kept out of the main text to make it easy for a non-specialist reader, and much of the text deals with real-life instances of bias. 324 footnotes point to the book's scientific basis.

 

If it has a flaw, it's on the philosophical side rather than the psychological. To judge that some behaviour is irrational, you need a clear standard of rationality and an understanding of why it is the best choice. Sutherland is good at succinctly explaining the normative standards, but some of his examples are not as clear-cut as he makes them out to be.

 

Outline

 

1. Introduction

 

2. The Wrong Impression

  • Availability heuristic
  • Primacy effect: Asch experiment in which rearranging the order of words in a description changes the impression it makes on subjects
  • Halo effect / devil effect
  • Halo effect bias in peer review

 

3. Obedience

  • Milgram obedience experiment

 

4. Conformity

  • Asch conformity experiment 
  • Effect of verbal commitment on future action: consistency effect
  • Credibility of experts: halo effect
  • Crowd behaviours: panic, violence, contagious emotion (including religious conversion)
  • Bystander effect (conforming to others' inaction)

 

5. In-groups and Out-groups

  • Attitudes of those who stay in a group drift not towards the centre but towards the extreme of the group's distinctive attitude
  • Risky shift: group judgements are riskier than the judgements of the individuals who make them
  • Groupthink (Janis): "illusion of invulnerability coupled with extreme optimism"; stereotyped thinking about people outside group; suppression of doubt, dissent and unfavourable facts
  • Effect of uniform worn by subject on aggression
  • Sherif research on conflict between arbitrarily created groups
  • Stereotype bias in memory

 

6. Organisational Folly

  • Leslie Chapman's book Your Disobedient Servant on inefficiency in the British Civil Service
  • Examples of bias towards inefficiency in large organisations
  • The "fat cat" phenomenon when directors decide their own salaries
  • Contrarian investment strategy: advisers on equities routinely underperform the market

 

7. Misplaced Consistency

  • Self-justificatory bias / "bolstering a decision": a chosen item (e.g. a house once bought) seems much more desirable after it has been chosen
  • Distortions of attitude in relationships
  • Escalation of commitment ("foot in the door")
  • Justification of effort
  • Sunk cost fallacy
  • Cognitive dissonance: Festinger and Carlsmith

 

8. Misuse of Rewards and Punishments

  • Intrinsic motivation decreased by excessive extrinsic motivation: "people who are trying to gain a prize will do less imaginative and less flexible work than those of equal talent who are not. In addition they may come to work less hard after winning the prize."
  • Punishment of children making them less obedient
  • Langer experiment on choice: a chosen lottery ticket is much more valuable to subjects than a randomly allocated lottery ticket. Related experiments with children, students, medical patients
  • "If you are a manager, adopt as participatory and egalitarian a style as possible."

9. Drive and Emotion

  • Reward improves performance on simple tasks; worsens performance on difficult tasks; effects of reward persist afterwards
  • Wishful thinking and self-serving biases; subjects shift their behaviour in the direction that (they are told) indicates a healthy heart
  • Stress worsening memory recall
  • fear; boredom; love

 

10. Ignoring the Evidence

  • Military example: Pearl Harbor
  • Wason rule-discovery task (confirmation bias): people don't try to prove their hypotheses false
  • Wason conditional (selection) task with cards A, D, 3, 7; a sketch of the logic follows this list
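
A minimal sketch of the selection task's logic (mine, not the book's), assuming the standard rule "if a card has a vowel on one side, it has an even number on the other": the rule can only be falsified by a vowel paired with an odd number, so those are the only cards worth turning, yet subjects typically pick the cards that could merely confirm it.

```python
# Wason selection task: which cards must be turned to test the rule
# "if a card has a vowel on one side, it has an even number on the other"?
# The only falsifying combination is vowel + odd, so turn exactly the
# cards that could reveal it: visible vowels and visible odd numbers.

def must_turn(face: str) -> bool:
    """True if the visible face could belong to a rule-breaking card."""
    if face.isalpha():
        return face.lower() in "aeiou"  # a vowel might hide an odd number
    return int(face) % 2 == 1           # an odd number might hide a vowel

cards = ["A", "D", "3", "7"]            # the card set from the notes above
print([c for c in cards if must_turn(c)])  # ['A', '3', '7']
```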

 

11. Distorting the Evidence

  • Military example: the Battle of Arnhem
  • Lord, Ross and Lepper (1979) "Biased assimilation and attitude polarisation": Subjects read ambiguous information about death penalty; evaluate the information that supports their existing beliefs as "more convincing" and "better conducted": reading the same evidence strengthens beliefs both for and against
  • Resistance of beliefs to contrary evidence: subjects given random feedback on a meaningless task persist in believing they are good or bad at the task after thorough debriefing
  • Snyder and Cantor (1979): subjects given a written description of a person decide, two days later, whether that person is suitable for a particular job; they selectively recall the parts of the description that support their decision
  • Reasons for persistence of belief:
    1. "People consistenty avoid exposing themselves to evidence that might disprove their beliefs."
    2. "On receiving evidence against their beliefs, they often refuse to believe it."
    3. "The existence of a belief distorts people's interpretation of new evidence in such a way as to make it consistent with the belief."
    4. "People selectively remember items that are in line with their beliefs." (see Seven Sins of Memory)
    5. Belief is shaped by "the desire to protect one's self-esteem."
    6. If the explanation the subject comes up with for the belief is satisfying, then the belief itself perseveres.

 

12. Making the Wrong Connections

  • The need to use all four cells of a 2×2 table in judging correlation (see the sketch after this list)
  • Chapman experiments on illusory correlation
  • Perceived usefulness of the Rorschach test and graphology as being due to illusory correlation
  • As well as illusory correlation there is correlation blindness: people "see" correlations that fit their prejudices, and fail to spot correlations (even 100% correlations) that don't
  • Illusory correlation of the "odd one out"; whether a word or a person
  • Hamilton and Gifford (1976) experiment on illusory correlation between minority groups and minority (deviant) behaviours
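
A minimal sketch (not from the book) of why all four cells matter: the phi coefficient of a 2×2 contingency table uses every cell, and a large count in the "both present" cell that people fixate on can coexist with zero correlation. The counts below are invented for illustration.

```python
# Correlation in a 2x2 table depends on all four cells, not just the
# "both present" cell that subjects tend to fixate on.
from math import sqrt

def phi(a, b, c, d):
    """Phi coefficient: a = both present, b = first only,
    c = second only, d = both absent."""
    return (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))

# 80 cases in the headline cell, yet the full table shows no association:
# the rate is 80/100 whether the second variable is present or absent.
print(phi(80, 20, 80, 20))  # 0.0
```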

 

13. Mistaken Connections in Medicine

  • Confusion between direct and reverse probabilities (e.g. the probability of having breast cancer given a positive mammography result, versus the probability of a positive result given breast cancer); a worked example follows this list
  • Eddy's demonstration (in Judgment Under Uncertainty: Heuristics and Biases) that this sort of error is pervasive in medicine; even in medical research journals
  • Overconfidence bias and misjudgment of the risks of operations
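
A worked example of the direct/reverse confusion, using Bayes' theorem with illustrative figures of my own rather than the book's: even a fairly accurate test leaves a low probability of disease after a positive result when the disease is rare.

```python
# P(cancer | positive) is not P(positive | cancer): Bayes' theorem.
# The figures below are illustrative assumptions, not quoted from the book.
prevalence = 0.01        # P(cancer)
sensitivity = 0.80       # P(positive | cancer)
false_positive = 0.10    # P(positive | no cancer)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_cancer_given_positive = sensitivity * prevalence / p_positive
print(round(p_cancer_given_positive, 2))  # 0.07, far below the 0.8 many expect
```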

 

14. Mistaking the Cause

  • Fallacy of "like causes like", e.g. in homeopathy (and in mainstream medicine before the 20th century)
  • Examples of mistaken causal reasoning from medicine and therapy, including the myth that cholesterol-laden food makes heart attacks more likely, and the craze for tonsillectomies even when doctors' judgements of which tonsils needed removal were seemingly random
  • Fundamental attribution error: ignoring the effect of a situation on a person's behaviour
  • Effect of physical viewpoint on interpretation of behaviour
  • Illusory similarity of others' choices to our own
  • Illusory introspection: subjects poor at judging what factors affect their mood, or their responses to situations

 

15. Misinterpreting the Evidence

  • Representativeness (e.g. Judgements of which sequences are "random")
  • Conjunction fallacy (feminist bank teller)
  • Confusion between direct and reverse probabilities (continued); base rate ignorance
  • Statistical fallacies, e.g. ignoring the effect of sample size (a simulation follows this list)
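
A quick simulation of the sample-size effect (mine, not the book's), in the spirit of Kahneman and Tversky's hospital problem: a small hospital records an extreme proportion of boys far more often than a large one.

```python
# Small samples stray from the true 50% rate far more often than large ones.
import random

def extreme_days(births_per_day, days=50_000):
    """Fraction of days on which more than 60% of births are boys."""
    count = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(births_per_day))
        count += boys / births_per_day > 0.6
    return count / days

print(extreme_days(15))  # small hospital: roughly 15% of days
print(extreme_days(45))  # large hospital: roughly 7% of days
```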

 

16. Inconsistent Decisions and Bad Bets

  • Framing effects on betting behaviour (people accept or reject the same bet depending on how it is described); bias towards certainty (a sketch follows this list)
  • Asymmetric assessment of gains and losses
  • Intransitive preferences from ignoring small differences in favour of large differences
  • Different answers to which bets are preferred and which bets are worth more
  • Economic behaviour: treating a £5 saving on a £20 item differently from a £5 saving on a £200 item
  • Loftus research on distortion of memory by post-event information
  • Anchoring heuristic
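
A minimal sketch of the framing point, with illustrative amounts (not the book's): the gain and loss versions of the choice are mirror images with identical expected values, yet most people take the sure thing in the gain frame and the gamble in the loss frame.

```python
# The same prospects framed as gains or as losses draw opposite choices,
# even though the expected values are unchanged. Amounts are illustrative.
def expected_value(outcomes):
    return sum(p * x for p, x in outcomes)

sure_gain  = [(1.0, 3000)]
risky_gain = [(0.8, 4000), (0.2, 0)]
print(expected_value(sure_gain), expected_value(risky_gain))  # 3000.0 3200.0

sure_loss  = [(1.0, -3000)]
risky_loss = [(0.8, -4000), (0.2, 0)]
print(expected_value(sure_loss), expected_value(risky_loss))  # -3000.0 -3200.0
```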

 

17. Overconfidence

  • Hindsight bias: Fischhoff's original experiments
  • Superiority bias: 95% of British drivers think they are above average; financial traders on average do worse than the market, yet each believes his own potential is better
  • Overconfidence in one's own judgments, including professional judgments
  • Illusion of Control

 

18. Risks

  • Three Mile Island and other industrial disasters
  • Wagenaar experiments on ignorance of warnings
  • Nuclear versus fossil fuels: biased assessment of advantages due to association, availability, halo effect

 

19. False Inferences

  • Satisficing as a decision procedure
  • Regression to the mean (e.g. the "hot hand" fallacy); a simulation follows this list
  • Failure to distinguish perfect and imperfect predictors (predicting other students' exam scores)
  • Subjects have greater confidence in prediction measures when those measures are extreme
  • Though a pair of highly correlated tests is a worse predictor of academic success than a pair of uncorrelated tests, subjects treat it as more informative (most of this chapter is based on Kahneman and Tversky (1973), "On the Psychology of Prediction")
  • Gambler's fallacy
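
A quick simulation of regression to the mean (mine, not the book's): if observed scores are stable skill plus random noise, the top scorers on one test land closer to the average on a retest, with no change in anyone's ability.

```python
# Observed score = stable skill + random noise, so extreme scores regress.
import random

random.seed(1)
skill = [random.gauss(50, 10) for _ in range(10_000)]
test1 = [s + random.gauss(0, 10) for s in skill]
test2 = [s + random.gauss(0, 10) for s in skill]

top = sorted(range(10_000), key=lambda i: test1[i])[-1000:]  # top 10% on test 1
print(sum(test1[i] for i in top) / 1000)  # about 75
print(sum(test2[i] for i in top) / 1000)  # about 62: halfway back to the mean
```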

 

20. The Failure of Intuition

  • Superiority of actuarial prediction (i.e. algorithmic, in particular using multiple regression analysis) over intuitive prediction in many different contexts; a sketch follows this list
    • "out of more than a hundred studies comparing the accuracy of actuarial and intuitive prediction, in not one instance have people done better, though occasionally there has been no difference between the two methods. In the great majority of cases, the actuarial method has been more successful by a considerable margin"
  • Why is intuitive prediction so bad?
    • People make mistaken connections
    • People combine multiple pieces of information in the wrong way (see "False Inferences")
    • Inconsistency: a person's judgements can vary greatly from day to day
  • Resistance to actuarial techniques due to illusory superiority (e.g. overconfidence) of intuition
  • Use of expert systems in medical and commercial decisions (e.g. bank loans)
  • The irrationality of "head-hunters"
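
A minimal sketch of what "actuarial" prediction amounts to (variables and coefficients invented for illustration): fit a regression formula to past cases, then apply the same formula to every new case. That is exactly what makes it consistent where human judgement varies from day to day.

```python
# Actuarial prediction: fit weights on past cases, reuse them unchanged.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))  # e.g. test score and past performance
y = 1.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

A = np.column_stack([X, np.ones(200)])           # add an intercept column
weights, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit
print(weights)                                   # roughly [1.5, 0.5, 0.0]

new_case = np.array([1.0, -0.5, 1.0])            # includes the intercept term
print(new_case @ weights)                        # the same answer every time
```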

 

21. Utility

  • The rational standard: expected utility and the diminishing marginal utility of money (a sketch follows this list)
  • Limitations of utility theory: multidimensional problems; difficulty of knowing how happy one will feel in a particular outcome; loss aversion
  • Irrational outcomes of purely financial cost-benefit analysis
  • Medical decision-making and Quality-Adjusted Life Years (QALYs)
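
A minimal sketch of diminishing marginal utility, with illustrative figures: under a concave utility function such as log, a fair 50/50 gamble has lower expected utility than keeping the same expected amount of money for certain, which is why declining a fair bet can be rational.

```python
# With concave utility, a fair gamble is worth less than its money value.
from math import log

def expected_utility(outcomes, u=log):
    return sum(p * u(x) for p, x in outcomes)

wealth = 10_000
gamble = [(0.5, wealth + 5_000), (0.5, wealth - 5_000)]  # fair 50/50 bet

print(expected_utility([(1.0, wealth)]))  # about 9.21: the sure thing
print(expected_utility(gamble))           # about 9.07: lower, so decline
```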

 

22. The Paranormal

Biases that promote belief in the paranormal:

  • Reluctance to suspend judgement
  • Animism (tendency to interpret events in terms of intention and agency)
  • Wishful thinking
  • Availability: paranormal stories make the news; null results and retractions don't
  • Conformity: people, to an extent, see what they are told to see
  • Distorted assessment of evidence/confirmation bias, e.g. Forer effect/Barnum effect in evaluation of horoscopes
  • Cognitive dissonance: having paid money for a tarot reading or psychic treatment, people are under pressure not to admit that it is bogus.
  • Illusory correlation, e.g. between dreaming about an incident and it happening in real life the next day
  • Innumeracy: people are simply wrong about the probabilities of events happening by coincidence (a sketch follows this list)
  • Fallible memory can be distorted so that coincidences seem more significant
  • Blindness to sample size: people are persuaded by a single case when a large sample shows no evidence
  • Overconfidence in belief: proponents not taking seriously the possibility they are wrong
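
A back-of-envelope sketch of the innumeracy point, with invented figures: a coincidence that is vanishingly unlikely for any one person becomes near-certain somewhere in a large population.

```python
# Even a one-in-ten-thousand "prophetic dream" is expected somewhere
# when a million people remember a dream on a given night.
p = 1 / 10_000          # chance one person's dream matches the next day
dreamers = 1_000_000    # illustrative population of dream-rememberers

p_at_least_one = 1 - (1 - p) ** dreamers
print(p_at_least_one)   # effectively 1.0: some such coincidence is expected
```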

 

23. Causes, Cures and Costs

Five basic causes of irrationality:

  • Evolutionary value of group conformity, strong emotion
  • Biological randomisation and hypersensitivity in neural connections
  • Heuristics
  • Ignorance of elementary probability and statistics
  • Self-serving biases (i.e. enhancing self-esteem)
Education in statistical concepts, in economics or in psychology seems to counter some biases.
