Superhumans


(Originally posted 4 September 2007 on the Bias and Belief Blog)
 

Imagine someone who claims to have superhuman powers: he has x-ray vision, he assures you, strength enough to lift a car, and the ability to bend metal with his bare hands. When you say that’s unlikely, he takes it as an insult: okay, his powers aren’t at their best right now, but it’s plain to him that he is superhuman. You might find his insistence eccentric or amusing, but then you learn he’s going on a long drive across the desert without taking tools, a jack or extra supplies. If the car develops a fault, he swaggers, he’ll diagnose it with his super-sight, lift the car with his super-strength, and bend the metal right back into shape.

 

No longer amused by his confidence, you try to dissuade him. A reasonable person making the same trip, properly prepared, would be bold and adventurous; he just seems blind to the risk. If only he would acknowledge who and what he really is, he could still have his fun and avoid a potential disaster.

 

Now of course that story was just an analogy. You, I and the people we meet every day never claim to have superhuman powers, do we? Well, consider these abilities: perceiving the world exactly as it is, remembering events just as they happened, judging without being swayed by irrelevant influences, and knowing through introspection why we really think and act as we do.

 

Repeated scientific research shows that human beings have none of these abilities, any more than a cat has wings. Someone who did have them would be, by definition, superhuman. People who wrongly believe they have them, however, are entirely normal. Self-identified superhumans, then, are everywhere. We all have that failing to some extent, although in some people it is a marked personality fault. Whenever people claim to know things for which they have no evidence, they are in effect claiming to be superhuman.

 

The fact that perception, memory and judgment are unreliable does not stop us finding genuine knowledge, just as the unreliability of a car does not mean travel is impossible. For a trip across a desert, you could prepare well, taking spare parts, all the necessary tools and plenty of supplies. Similarly, in deciding what’s true, there are techniques we can use to compensate for our own subjectivity: checking our beliefs against evidence, actively seeking out information that could prove us wrong, and putting our conclusions on public display for others to challenge.

 

In trying to get people to think critically about what is or is not true, the main obstacle seems to be getting them to accept that their beliefs need checking at all, rather than assuming that reality is simply the way they see it. No one denies that scientific conclusions can sometimes turn out to be affected by individual biases, shaped by prejudice or just plain mistaken. The superhumans take this to mean that they are better off relying on their own beliefs or feelings about what’s true, but that inference ignores the fact that human perception and memory have all the same problems and worse. Quoting Tavris and Aronson (2007), Mistakes Were Made (But Not by Me):

Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control.

 

Some of the posts on this blog are about research into biases, whereas others, like this one, log my attempts to craft ever better text describing my position on biases; hence I haven’t filled this post with references. On the distortion of decisions by irrelevant features of the environment, see the research by Tversky, Kahneman and colleagues on the anchoring heuristic, or any of the psychological research on stereotypes. On confusions between sources of memory, see Daniel Schacter’s Seven Sins of Memory research. On the illusion of introspection, see Timothy D. Wilson (2004), Strangers to Ourselves: Discovering the Adaptive Unconscious. The difficulty of teaching people to be actively open-minded is explored at length in Jonathan Baron’s Thinking and Deciding.