Rationally, your political values shouldn't affect your factual beliefs. Nevertheless, that often happens. Many factual issues are politically controversial - typically because the true answer makes a certain political course of action more plausible - and on those issues, many partisans tend to disregard politically uncomfortable evidence.
This sort of political bias has been demonstrated in a large number of psychological studies. For instance, Yale professor Dan Kahan and his collaborators showed in a fascinating experiment that on politically controversial questions, people are quite likely to commit mathematical mistakes that help them retain their beliefs, but much less likely to commit mistakes that would force them to give up those beliefs. Examples like this abound in the literature.
Political bias is likely to be a major cause of misguided policies in democracies (even the main one, according to economist Bryan Caplan). Unless they have some special reason not to, people without special knowledge defer to the scientific consensus on technical issues. Thus they do not interfere with the experts, who normally get things right. On politically controversial issues, however, people often let their political bias win over science and evidence, which means they end up with false beliefs. And in a democracy, voters holding systematically false beliefs more often than not translates into misguided policy.
Can we reduce this kind of political bias? I’m fairly hopeful. One reason for optimism is that debiasing generally seems to be possible, at least to some extent. My optimism was strengthened by participating in a CFAR workshop last year. Political bias does not seem fundamentally different from other kinds of bias and should thus be reducible too. Obviously one could argue against this view, and I’m happy to discuss it further.
Another reason for optimism is that the level of political bias seems to be lower today than it was historically. People are better at judging politically controversial issues in a detached, scientific way today than they were in, say, the 14th century. This shows that progress is possible, and there seems to be no reason to believe it couldn’t continue.
A third reason for optimism is that there seems to be a strong norm against political bias. Few people are consciously and intentionally politically biased. Instead, most people seem to believe themselves to be politically rational, and to hold that as a very important value (or so I believe). They fail to see their own biases because of the bias blind spot, which prevents us from noticing the biases we readily spot in others.
Thus, if you could somehow make it salient to people that they are biased, they would actually want to change. And if others could see how biased they are, the incentives to debias would be even stronger.
There are many ways to make political bias salient. For instance, you could meticulously go through political debaters’ arguments and point out fallacies, as I have done on my blog; I will post more about that later. Here I want to focus on another method, however: a political bias test that I have constructed with ClearerThinking, run by EA member Spencer Greenberg. Since learning how the test works might make you answer a bit differently, I will not explain it here, but instead refer you either to the explanatory sections of the test or to Jess Whittlestone’s (also an EA member) Vox.com article.
Our hope is, of course, that people taking the test might start thinking more both about their own biases and about the problem of political bias in general. We want this important topic to be discussed more. Our test is produced for the American market, but hopefully it could work as a generic template for bias tests in other countries (akin to the Political Compass or Voting Advice Applications).
Here is a guide for making new bias tests (where the main criticisms of our test are also discussed). Also, we hope that the test could inspire academic psychologists and political scientists to construct full-blown scientific political bias tests.
This does not mean, however, that we think that such bias tests in themselves will get rid of the problem of political bias. We need to attack the problem of political bias from many other angles as well. I will return to this problem in later posts.
Interesting test. I scored quite low in terms of political bias, but there's certainly a temptation to correct or over-correct for your biases when you're finding it very hard to choose between the options.
Great resource. The Google Doc guide is a really nice touch - very helpful. Two comments:
1) Re: 'don't know' answers. A confidence/credence slider may help, allowing people to give more fine-grained responses. Scores could then be modified to some version of credence*correctness. That way you don't punish those who simply don't know whether it was a 2% or 1% renewables increase (ignorant, not biased), and you increase the punishment for those who are 100% sure that the wrong answer is correct (biased). This would retain the punishment for always leaning the same way. An example here: http://www.2pih.com/caltest/ (A rough sketch of such a scoring rule follows these comments.)
2) The questions must be tough to phrase, but I found a couple ambiguous.
Q5 on foreign aid: 'minor reason' and 'barely a reason' sounded synonymous to me. I picked 'minor' because any aid has an opportunity cost to domestic spending. I know the source can't really help with this.
Q6 on emissions: it may help to clarify that this is production, not consumption, emissions. (Having briefly looked into it, though, the distinction doesn't seem to make much of a difference.)
Q17 on emigration destination popularity. I read this as 'US compared to [ALL COMBINED] other countries', as opposed to 'US compared to [INDIVIDUAL] other countries', which changes the answer. Silly on my part, but might be worth clarifying.
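To make comment 1 concrete, here is a minimal sketch of what credence-weighted scoring could look like. The scoring rule below is a hypothetical illustration of the credence*correctness idea, not anything the actual test implements:

```python
# Hypothetical credence-weighted scoring rule, sketching the
# "credence * correctness" idea from comment 1 above.

def score_answer(credence: float, correct: bool) -> float:
    """Score one two-option question.

    credence: stated probability that the chosen option is right,
              from 0.5 (pure guess) to 1.0 (fully certain).
    Returns a score in [-1, 1]: a 50/50 guess scores zero whether or
    not it turns out right, so the ignorant-but-honest respondent is
    not punished; confident wrong answers are punished most heavily.
    """
    if not 0.5 <= credence <= 1.0:
        raise ValueError("credence must be between 0.5 and 1.0")
    magnitude = 2 * (credence - 0.5)  # 0 for a pure guess, 1 at certainty
    return magnitude if correct else -magnitude


print(score_answer(0.5, correct=False))  # 0.0  -> ignorant, not biased
print(score_answer(1.0, correct=False))  # -1.0 -> confidently wrong
print(score_answer(0.8, correct=True))   # ~0.6
```

One caveat with this simple linear rule: it still rewards overstating your confidence whenever you lean one way, so a real implementation might prefer a proper scoring rule such as the Brier score, under which reporting your true credence is optimal.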
This test, though it has good intentions, has several possible flaws:
I suppose it's looking for a concordance between people's political beliefs and their views on factual issues? But this type of concordance isn't necessarily indicative of political bias: if someone believes certain facts, they're going to base their political beliefs on those facts. For example, it would be bizarre if someone believed that the welfare state increases poverty but also identified as a Democrat. Like, "You do realize what the Democrats support, right?"
The "facts" in this test are not necessarily the product of widespread consensus. For example, did the stimulus package reduce unemployment compared to the counterfactual? I don't think that's a settled issue.
This test is easy to game, just as it's easy to get a diagnosis of ADHD by answering "strongly agree" to statements like "I have trouble paying attention". Many questions are clearly trick questions, which everyone knows how to answer.
There's little opportunity to say "I don't know." Did renewable energy increase by 1 or 2 percent? Hell if I know. Is that considered a big difference? I just had to guess.
1) This question is discussed at length in the sections after the test.
2) It is according to our source. But some of the questions could have been better phrased. We will update them.
3) I wouldn't say it's easy to game. In fact, saying that it is a bias test had little effect in our Mechanical Turk pre-tests, which suggests that most people don't try to game it. That said, it is possible to game. It is very hard to construct a test like this that is impossible to game; other similar tests are far easier to game (see, e.g., Hans Rosling's test of global progress, which is in effect a bias test).
The test obviously isn't going to be a reliable measure of bias if people try to game it - that is, if they try to be more unbiased than they normally are. Still, taking the test could make these people ask themselves why they don't normally adjust for their biases in this way. Hence the test could to some extent fulfill its ultimate purpose - getting people to think more about their own biases and about the problem of political bias - even when it isn't accurate as a measure of political bias.
4) True, but if you consistently guess in a direction that favours your political opinions, that suggests bias. That said, ideally it should be possible to indicate how strong your confidence in your answers is.
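To illustrate point 4 with a hypothetical back-of-the-envelope calculation (nothing this formal happens in the test itself): under unbiased guessing, the direction of your errors should look like fair coin flips, so a strongly lopsided pattern is unlikely to be chance.

```python
# Hypothetical check: how often would unbiased guessing produce errors
# that mostly lean in one fixed political direction?
from math import comb

def p_at_least_k_lean_same_way(n_errors: int, k: int) -> float:
    """Probability that at least k of n_errors wrong answers lean in
    one pre-specified political direction, if each error were an
    independent 50/50 coin flip (i.e., no bias)."""
    return sum(comb(n_errors, i) for i in range(k, n_errors + 1)) / 2 ** n_errors

# If 9 of your 10 wrong answers all flatter your own politics, chance
# alone would produce that only about 1% of the time.
print(p_at_least_k_lean_same_way(10, 9))  # ~0.011
```

This is why consistently guessing in one direction is evidence of bias even when any single wrong answer is mere ignorance.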
This is a very interesting approach, which I praise. However, the test has some ceiling effects (Sally Murray noted that the percentiles for a given score have been plummeting). It might help to have more and harder questions, or to use continuous variables to pick up finer distinctions.
Thanks, Carl!
Yes, you're right about the ceiling effects. We started out with questions where science hasn't established an answer, but where we guesstimated that the probability that the conservative/liberal answer is right was 50% on average. That system wouldn't have had these ceiling effects. We ran a pre-test of that version on Mechanical Turk, but the results were very skewed (conservatives came out as much less biased than liberals, probably due to poorly constructed questions), so I decided to abandon that strategy for this one.
Here are two other posts I've written on this general strategy of inferring bias from belief structures. If you have any ideas about smarter ways to devise bias tests, I'd be very interested to hear them.
Hi Stefan,

It is interesting to consider the possibility of people making more rational decisions when voting if they could learn to be more aware of their own personal biases. I hope that was a fair summary. I have a number of questions to throw at this concept:

1 - Does it really matter what the voters think anyway (is it a true democracy)? http://www.commondreams.org/views/2014/04/14/us-oligarchy-not-democracy-says-scientific-study

2 - Is it reasonable to expect that people would welcome having their world views challenged and debunked? (Or is this intended to be enforced?)

3 - What actually causes this problem? Can it be "prevented" instead of "cured"?
Keep the ideas coming. Great job.