Who Wants to Be a True Believer

Saved by PBworks on August 3, 2007 at 10:36:51 am

Here is my idea for a game show. It is not going to make me a hot-shot TV producer but it will serve to illustrate some philosophical points about the explanation of strange beliefs. In my game show, an immaculate, smiling host tests contestants’ general knowledge by reading out a statement. The contestant then has to say “True” or “False” and, depending on their answer, can win up to a thousand pounds.

When you visualise this show, you probably add in a piece of information which isn’t in my description: that the contestant wins the money for correctly stating whether the statement is true or false. We are used to quizzes that motivate the contestant to give correct answers. In philosophical terminology, these are epistemically motivating situations. One non-epistemically motivating situation would be a game show where the contestant is paid a thousand pounds for answering “True”, no matter whether the statement they’ve heard is true or false. This is an absurd situation and would make for very boring television, but in an important respect it is strongly analogous to situations in which human beings find themselves every day, especially when forming opinions about themselves, or about certain topics in the areas of religion or the paranormal. Individual motivations, for example towards psychological comfort or towards a positive self-image, can take the role of the cash prize as motivations to affirm or deny statements irrespective of their truth or falsehood. Likewise, social factors such as pressure to conform, to get on with other people or to please one’s superiors can reward or punish the opinions that we voice.

 

It would be incorrect to talk of belief per se being directly motivated, since belief itself is not a matter of choice. If I were offered a million pounds to believe that the Earth is flat, I would enthusiastically say that it is flat and otherwise imitate someone who has that belief, but the reward itself would not shake my belief in the roundness of the Earth. Hence it makes sense to distinguish belief itself from a set of behaviours including assertion, denial or agreement, as well as information-gathering behaviour, such as the choice of what to read, what meetings to attend or whether to conduct experiments. Some philosophers (most prominently Dennett) distinguish belief from opinion. Opinion, put simply, is what someone expresses in language when you ask them whether or not a given statement is true. Belief is a more subtle concept, which we can gloss as information about the world which underlies and explains an agent’s behaviour. Put most colloquially, as a guide to belief, actions speak louder than words.

 

Although belief itself cannot be regarded as a matter of choice, information-gathering behaviour can be. This gives an indirect way in which belief can be affected by motivation. If I wanted to win a million-pound prize for believing that the Earth is flat, I could seek out flat-earthers, read their publications and, more importantly, actively avoid reliable sources of information or rational debate on the subject. In the long term, my belief may at least shift slightly towards the flat-earth position.

 

The success or failure of such an indirect change of belief depends to a large extent on the surrounding culture. Nowadays, it would be almost impossible to set about avoiding evidence that the Earth is round, but if I wanted to believe that the diversity and richness of life on Earth arose through special creation rather than evolutionary processes I would have a much easier task. The roomfuls of scientific evidence for evolutionary theory are mostly tucked away in university libraries, whereas it is easy to come across people or groups who will eagerly decry the theory and encourage me to do the same.

 

When you first encounter it, it is easy to mistake the epistemic/non-epistemic distinction for the distinction between honesty and lying. After all, both seem to be cases where you think one thing and say another. We need to push the game show analogy further to see why this is not so. When we form opinions on some matters, we are rewarded or punished not by cash prizes or other forms of external reward but by our own inner feelings. Contemplating the idea that the used car I just bought was not worth anything near what I paid is an uncomfortable experience. If my intellectual conscience allows any alternative, I would rather contemplate that alternative. On the other hand, joining the Loyal Order of the Bongo and affirming to myself “I am now a Bongoist” can carry with it positive feelings. There is a feeling of common purpose with other Bongoists around the world, of inheriting the historical achievements of the great Bongoists, or of being one of the privileged minority that knows the secrets of the Bongo. Situations of this sort show us that non-epistemic motivation can affect what people admit to themselves in just the same way as it shapes the opinions expressed to other people. This is why non-epistemic motivation covers many more cases than simple lying.

 

The question of which epistemic or non-epistemic motivations affect people’s thinking is an empirical one, to be explored through psychology and sociology. A helpful pointer comes from the well-replicated research on cognitive dissonance. This is a phenomenon in which people reform their opinions, behaviour or memories to defend their self-image, especially positive aspects of their self-image (Aronson 1994). This tells us that if we are looking for matters on which people are non-epistemically motivated we should look at issues that involve them personally, especially ones in which their personal worth is in question. People are the sort of creatures who find it easy to reason purely epistemically about the height of Everest or the average weight of an adult rabbit but difficult to take that same attitude to questions such as whether they are well-liked, whether their life so far has reflected their true potential and whether they will survive bodily death.

 

Specific non-epistemic motivations can derive from positive things that people want to think about themselves. A thought like “I am a great intellect” may well drive the isolated souls who convince themselves that they have invented perpetual motion or refuted special relativity. A thought like “I can help other people with their problems” might underlie the behaviour of someone who presents themselves as having healing powers, while “My chronic illness can be healed” might underlie the eager acceptance of people who go to such a healer. Other positive thoughts that people can have about themselves are that bad things that happen to them are not their fault or that their lives are meaningful. Such thoughts may underlie grand conspiracy theories or therapy systems that picture people as crippled victims of past experiences.

 

Some opinions may be non-epistemically motivated because they attach to desirable social roles, such as medium, healer or abuse survivor. Others might be purely recreational, in that they provide something psychologically which people do not get from a permanent focus on mundane reality. This could be freedom from the worries or tensions of everyday responsibility, a sense of adventure and meaningfulness, or just the possibility of new and unusual experiences.

 

This is not to say anything bad about recreation or the pursuit of new experience. Scientists, mathematicians and philosophers are driven by motivations which include the search for beauty and the fun of discovering something completely new. When someone is motivated to explore a question in search of the truth, they are not indulging in recreational belief. It only counts as such when the recreational motivation bypasses inquiry and determines the content of the opinion.

 

As well as these psychological investigations, there is a philosophical task to be done: that of using thought experiments such as the non-epistemic game show to extract purely theoretical points about epistemic and non-epistemic motivation.

 

For example, if you were on the game show where you won a cash prize simply for saying that what the host says is true, you would give the same answer to every question. Your knowledge or critical thinking ability would not come into use because they are not being demanded by the game. Now take the analogous situation of someone who holds absurd opinions because those opinions offer hope for a bright future and are shared by everyone in the local church. The failure to assess those opinions against evidence is not necessarily down to a lack of ability, but is down to the motivation of that person in that situation. The social “game” that person is playing is that of finding and maintaining a meaningful social role which gets them respect, companionship and/or a sense of purpose rather than that of coming to the truth.

 

There is an important lesson here for those who want to promote skeptical thinking. Appalled by the lack of critical thinking in the populace, scientists and educators may diagnose the problem as a lack of thinking skill, and try to remedy this by training people to spot fallacies or to argue with sophisticated logic. If the failures of critical thinking are actually failures of motivation, such as conformity or an attachment to positive assessments of oneself, then this training can make the original problem worse. If we teach people more principles of critical thinking without addressing the motivational biases, then we are only enabling them to more effectively defend their biased positions from legitimate criticism.

 

An educational process that addressed the motivational problem would teach people that serious inquiry can be a rewarding process and alert people to the cognitive and motivational biases that can detract from it. Most importantly, it would encourage people to distance themselves from their own opinions so that threats to those opinions do not provoke the cognitive dissonance process that protects a positive self-image.

 

Epistemic and non-epistemic motivation are not easy to distinguish in practical contexts because someone whose opinion is non-epistemically motivated is, in effect, acting out a role and they may well be a very good actor. The differences will show themselves most clearly in responses to further information.

 

For someone who is epistemically motivated, a source of reliable information is a valuable thing. Imagine that you are asked a true/false question with a prize of a thousand pounds for the correct answer. If you have no idea what the right answer is and you guess, your expected gain is (½ × £1000) + (½ × £0), or five hundred pounds. If you know the right answer, then your expected gain is the full thousand, so in this case one bit of reliable information is worth five hundred pounds.
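The arithmetic above can be sketched in a few lines of code. This is a minimal illustration (not part of the original essay) comparing expected winnings with and without reliable information, in both the ordinary quiz and the strange “always pay for True” game:

```python
# Illustrative sketch: a flat prize for a correct True/False answer.
PRIZE = 1000  # pounds

# Epistemic game, guessing blind: right half the time, wrong half the time.
expected_guess = 0.5 * PRIZE + 0.5 * 0

# Epistemic game, knowing the answer: always right.
expected_informed = 1.0 * PRIZE

# One bit of reliable information is worth the difference in expected gain.
value_of_information = expected_informed - expected_guess
print(value_of_information)  # 500.0

# Non-epistemic game: paid the prize for saying "True" regardless of the
# facts, so a crib sheet leaves the expected gain unchanged.
expected_without_crib = 1.0 * PRIZE
expected_with_crib = 1.0 * PRIZE
print(expected_with_crib - expected_without_crib)  # 0.0
```

The same comparison underlies the next two paragraphs: reliable information raises the epistemically motivated player’s expected gain, but adds nothing for the non-epistemically motivated one.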

 

If, conversely, you have a strong non-epistemic attachment to your opinion, then information does not have a positive value. Imagine again the strange game show where you are paid money simply to answer “True” to the true-or-false questions. If you were offered a crib sheet telling you which statements were true and which false, it would be no use to you because it would not increase the value of the prize you are easily able to get.

 

Not only is a reliable information source (whether that is a person, a publication or an experimental set-up) useless when you are non-epistemically motivated, it might actually be aversive: something that it is in your interests to avoid. There is naturally a social pressure to appear reasonable and to express coherent beliefs. Hence if you have a non-epistemic desire to affirm opinion X, then you should avoid a situation where you have to acknowledge that some evidence or reasonable argument conflicts with X. This aversion to information is a hallmark of non-epistemic motivation, because information can never have a negative value to someone who is epistemically motivated. Information aversion can show itself in minor decisions, such as how long you put off looking at your latest credit card statement. On the larger social scale, information aversion might appear as anti-intellectual or anti-science sentiment.

 

Another area where we should expect a significant behavioural difference is in the context of debate. If your motivation is to find the truth, an articulate critic of your opinions is potentially a valuable ally. It is in your interest to co-operate with such a person to gain information and to root out your own mistakes. On the other hand, if your opinions are held non-epistemically, then such a person is a nuisance because you neither want to give up your psychologically rewarding opinion nor justify it. In such a situation it is in your interest to do whatever will discourage that person from questioning or criticising you, whether that is changing the subject, using personal attacks, or simply avoiding any open debate or testing of your opinion.

 

There are two crucially different concepts of rationality at work here. One is scientific rationality: the appropriate use of rational standards of logic and evidence. The other is instrumental rationality: behaving in a way that achieves one’s goals. They only pull in the same direction when the agent’s aims are purely epistemic. People who accept absurd or indefensible opinions, whom we might chastise for their irrationality, may well be highly rational in the latter sense.

 

To the question of why an increase in the success of science and technology has been accompanied by increasingly unscientific or anti-scientific attitudes amongst the general public, the current perspective offers two distinctive possible explanations, in addition to whatever others we might come up with. The first presumes that until now people have accepted scientific authority non-epistemically. That is, they have not really understood or been convinced by scientific pronouncements and may not have taken on scientific beliefs. Their real unscientific beliefs have not manifested in social behaviour, such as joining a new age group or discussing paranormal topics openly with other people, because there were clearer cultural taboos against them. As culture becomes less conformist and people feel more psychological freedom, there is an explosion of such behaviours.

 

Another possible explanation hinges on the recreational nature of the unscientific ideas. Opinions that are due to simple error should be expected to die out as more accurate information becomes more freely available, but where the opinions are non-epistemically motivated because they attach to recreational behaviour, the availability of accurate information or of opportunities to put claims to the test may be felt as a threat or a nuisance. To defend themselves from this, people become more openly hostile to scientific inquiry.

 

In a very well-to-do area near to where I live, an evangelical Christian church is flourishing. Attending one of its meetings, I was struck by the contrast between the uninhibited, emotionally open behaviour of singing, hugging and shaking allowed in the church and the conventional world outside where such behaviour would be unacceptable. Whether or not the church’s doctrine is true (which I am not examining here), this recreational aspect of the church provides a heavy non-epistemic motivation to accept other activities and opinions such as the church’s preferred interpretation of the Bible. Perhaps those other opinions can be considered an “entry fee” for the church, which many people are willing to pay. Though the appeal of the church is more understandable when put in these cost/benefit terms, that does not mean that we have to agree with it. In fact we can and should object if they take it so seriously that they teach their children that evolution is wrong and evil, if they promote obedience as a great intellectual and moral virtue, or if they denounce medical treatment in favour of prayer.[i]

 

If the difference between the skeptic and the believer is down to values, or down to vague traits such as intellectual conscience, then it is not obvious why the skeptic should criticise the believer. We traditionally treat values as beyond rational debate, unlike purely factual matters. The same goes for other issues of taste or of lifestyle. I see two lines of response to this.

Firstly, the skeptic can point out that a person’s lifestyle choice can be criticised in so far as it has negative consequences for other people. This applies just as much to recreational belief and its consequences. If your recreational belief that you are a psychic sleuth goes as far as phoning the police to give spurious “leads” on a murder case, then you bear responsibility for the consequent waste of police time. The fact that you were self-deceived rather than merely deceiving is no excuse.[ii]

 

Secondly, recreational belief is an example of an all-too-common human failing; that of focusing on the immediate effects of our actions while neglecting the future or distant consequences. Someone who spends all their money now rather than saving for emergencies or retirement is making this kind of error, as is a teenager too fixated on soap operas or computer games to revise for an exam. Criticism in these cases goes beyond mere disapproval of someone else’s lifestyle, because these people are arguably going against their own self-interest. Similarly, when we make do with easily available, comforting answers to scientific questions or human problems we are passing up opportunities to broaden our minds with the often rich and beautiful explanations that genuine science brings up.

 

In conclusion, irrational reasoning, even when severely irrational in the scientific sense, can turn out to be rational in an underlying sense when the motivations behind it are explored. This does not negate the fact that irrationality can have very severe consequences. What it does tell us is that in dealing with irrationality we may need to understand motivations and personal values as much as we understand logic and evidence.

 

References

Aronson, Elliot. 1994. The Social Animal. New York: W.H. Freeman

Clifford, William Kingdon. 1901. “The Ethics of Belief” in Stephen, Leslie and Sir Frederick Pollock (Eds.), Lectures and Essays by the Late William Kingdon Clifford. London: Macmillan

Randi, James. 1989. The Faith Healers. Buffalo, New York: Prometheus



[i] Randi draws a significant parallel between Christian evangelism and overt entertainment in Chapter 16 of The Faith Healers (Randi (1989)).

[ii] Clifford (1901) gives a series of further examples of harmful behaviour stemming from commitment to belief, and uses this to argue that ethical responsibility attaches to such commitment.
