Bias introduction


The Varieties of Bias (Draft)

 

Imagine you are carrying out a political assassination with a sniper rifle. As the cruel dictator stands on the podium and you line up your shot, there are a number of reasons why you might not be sure of hitting your target.

 

The first is uncertainty. You know the area where the bullet will hit, but you can't be certain of the exact spot. You might not be an accurate shot at that distance. You might shake with nerves. Alternatively, the scope might not be focused properly so you can't see. In these circumstances, you expect that if you fire off a lot of bullets in succession, most will hit the dictator, but you can't know in advance which ones.

 

The second is systematic bias, where the bullet consistently lands, say, a metre to the left of where you aim. This might be because the scope isn't aligned properly. It might mean the barrel of the rifle has a defect, or it might be because you are not used to the gun. If you are aware of systematic bias, then you can compensate for it by deliberately aiming to the right.
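In statistical language, the first kind of miss is variance and the second is bias. A minimal simulation, with invented numbers for the scatter and the offset, shows both kinds of error and how a known systematic bias can be cancelled by compensating in the opposite direction:

```python
import random

random.seed(42)

def shots(aim, bias, spread, n=10_000):
    """Simulate n shots: each lands at aim + systematic bias + random error."""
    return [aim + bias + random.gauss(0, spread) for _ in range(n)]

target = 0.0  # the target's position, in metres

# Pure uncertainty: shots scatter around the target, no systematic offset.
uncertain = shots(aim=target, bias=0.0, spread=0.5)

# Systematic bias: every shot lands about a metre to the left (-1.0).
biased = shots(aim=target, bias=-1.0, spread=0.5)

# Compensation: aim a metre to the right to cancel the known bias.
compensated = shots(aim=target + 1.0, bias=-1.0, spread=0.5)

mean = lambda xs: sum(xs) / len(xs)
print(round(mean(uncertain), 2))    # near 0.0
print(round(mean(biased), 2))       # near -1.0
print(round(mean(compensated), 2))  # near 0.0 again
```

The compensated shots are still scattered (the variance remains), but their average lands back on target, which is exactly what compensating for a known bias can and cannot achieve.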

 

There is a third kind of bias which is also systematic, but different from a misaligned gun barrel. Imagine that, at the moment you have a clear shot on the dictator, next to him on the podium is the little girl who led the marching band. Although we are imagining you as an assassin, you are not willing to take an innocent life at any price (let's say you're in a John Woo movie). So you aim well to the other side. This means you will most likely miss the dictator, keeping only a small chance of hitting him, but it's worth it to have no chance at all of hitting the child. So although you will most likely miss, it is not through a defect of the gun or of your marksmanship; it is because killing the dictator is not your only goal: not the only thing you value. You could call this value bias, or value conflict.
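The tradeoff in this scenario can be made quantitative. Treating each shot as normally distributed around the point of aim, a short sketch (all positions, widths and scatter are invented for illustration) compares aiming dead-on with aiming well to the other side:

```python
import math

def hit_prob(aim, target, half_width, spread):
    """P(shot lands within half_width of target), shot ~ Normal(aim, spread)."""
    cdf = lambda x: 0.5 * (1 + math.erf((x - aim) / (spread * math.sqrt(2))))
    return cdf(target + half_width) - cdf(target - half_width)

spread = 0.5                 # metres of random scatter in each shot
dictator, child = 0.0, 0.8   # hypothetical positions on the podium

# Aiming dead-on maximises the chance of hitting the dictator,
# but leaves a real chance of hitting the child beside him.
p_dict_on = hit_prob(0.0, dictator, 0.3, spread)
p_child_on = hit_prob(0.0, child, 0.3, spread)

# Aiming a metre to the other side: only a small chance remains on the
# dictator, but the chance of hitting the child becomes negligible.
p_dict_off = hit_prob(-1.0, dictator, 0.3, spread)
p_child_off = hit_prob(-1.0, child, 0.3, spread)

print(round(p_dict_on, 3), round(p_child_on, 3))
print(round(p_dict_off, 3), round(p_child_off, 3))
```

The offset aim is "biased" relative to the goal of hitting the dictator, yet it is the rational choice once the second value, sparing the child, is included in the objective.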

 

In order to call something a bias rather than merely a conflict between values, there has to be a clear, objective goal. This is not always the case. Alice says that the main goal of a corporation is to make a profit for shareholders. Bob thinks that shareholders are only one group of interests to consider, and that a corporation has responsibilities to employees, customers and the wider community as well. When the company makes employees redundant to maximise profit, Alice sees this as a rational, unbiased decision, but from Bob's perspective, a decision about the welfare of the community has been biased by the power of the shareholders (Rubinstein 2006). So, when calling something a bias, we need a well-defined goal.

 

There are linguistic acts - acts of expressing or forming an opinion - which have the goal of truth (and related goals such as accuracy, informativeness and explanation). Not all use of language is declarative or literal (for example, people write poetry, tell jokes and act in plays), but when we do speak declaratively, we intend what we say to be taken as true. Even though our reasons for holding an opinion may have little to do with its truth or falsity, we do not expose that to the listener.

 

Like bullets, opinions might "hit" (i.e. be meaningful, true and informative) or "miss". They might miss for the same three reasons.

 

Uncertainty is inevitable. We lack a perfectly reliable way of distinguishing truth from falsehood. Human beings have limitations (note that a limitation is different from a bias): we cannot perceive things outside the range of human experience, our reasoning can be muddled, and our memories fade. We can overcome these limitations using scientific techniques of precise measurement, corroboration, statistical analysis and so on, and in some cases achieve very strong evidence, but those methods still carry their own uncertainty, even though the best science can make it vanishingly small.
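One reason corroboration and statistical analysis shrink uncertainty without eliminating it is the law of large numbers: averaging many independent noisy measurements narrows the estimate roughly in proportion to one over the square root of their number, but never to zero. A small sketch with made-up numbers:

```python
import random
import statistics

random.seed(0)

def measure(true_value, noise, n):
    """Average n independent noisy readings of the same quantity."""
    readings = [true_value + random.gauss(0, noise) for _ in range(n)]
    return statistics.mean(readings)

true_value = 10.0
spreads = []
for n in (1, 100, 10_000):
    # Repeat the whole experiment 200 times to see how much the
    # averaged estimate itself still varies.
    estimates = [measure(true_value, noise=1.0, n=n) for _ in range(200)]
    spread = statistics.stdev(estimates)
    spreads.append(spread)
    print(n, round(spread, 3))
```

Each hundredfold increase in measurements cuts the remaining uncertainty by about a factor of ten: vanishingly small, in the essay's phrase, but never quite zero.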

 

Another problem is that people can have systematic biases, like the gun barrel that veers to the left. Cognitive psychology has identified hundreds of weaknesses and errors in human judgement and memory, and an astonishing finding is that a large proportion of these are systematic, not random.

 

Cognitive errors are systematic in the same way as optical illusions (Piattelli-Palmarini 1994). If you put two identical shapes against a special background, you can create an optical illusion in which one appears larger or more irregularly shaped than the other. People all see the illusion the same way (they do not disagree about which of the two looks bigger), so this is systematic bias rather than uncertainty.

 

Value biases in the context of opinion have been called "non-epistemic values" (McMullin 1982). Examples are easy to find in science. At the individual level there can be political motivations that bias the interpretation of certain research, as when sexual stereotypes affect the interpretation of research on gender differences. The demands of a career can bias academics towards positive results, and the need for a gripping story can bias journalists towards reporting new research as shocking or groundbreaking. Researchers funded by the cigarette industry have an incentive to deny the link between smoking and cancer. Creationist religious groups have an incentive to find problems with the standard scientific picture of our origins. While value biases are well studied in science, mundane, everyday reasoning is no less vulnerable. While it is usually preferable to have true opinions rather than false ones, people also want to have comforting opinions, opinions that justify their actions, opinions that allow them membership of social groups, and so on. So opinions can be attractive for reasons that have nothing to do with their truth or falsehood.

 

This book is primarily about value bias of opinion. It investigates the distinction between epistemic (scientific) values, value biases and other kinds of value. It argues that in many cases where people's opinions seem disconnected from reality, what appears to be systematic bias is actually value bias. In other words, what looks like irrationality is often the rational pursuit of a non-epistemic goal.
