
I see a lot of talk about biases in effective altruist circles. I agree that it is very important to acknowledge that the human mind is flawed and consistently makes many mistakes. But I wish there were much more discussion of how these biases compare to one another. On many topics there are biases pushing in both directions, and without some sense of their relative strength it is hard to tell where the net bias actually leads. People seem to assume that most others are biased against their specific perspective, rather than towards it.

Imagine, for example, a member of the LessWrong community who is considering x-risk:

Biases that would make this person think that x-risk is a large concern:

- Anchoring - the tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions (usually the first piece of information that we acquire on that subject).
- Availability cascade - a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
- Attentional bias - the tendency of our perception to be affected by our recurring thoughts.

Biases that would make this person think that x-risk is not a large concern:

- Ambiguity effect - the tendency to avoid options for which missing information makes the probability seem "unknown."
- Availability heuristic - the tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.

This is just from the "A" section of Wikipedia's very long list of cognitive biases, and I am sure that a full list would have dozens of biases going each way. I am less interested in this specific example and more interested in how people deal with conflicting biases in general. What do you think?

Comments

Vaguely related point:

I sometimes see proponents of cause X (for almost any X) say things like "consider all the cognitive biases that would cause you not to think that cause X is the most important! Therefore you need to pay more attention to cause X." I think this is an extremely cheesy tactic, and possibly even logically rude, depending on how it's employed.

For many reasonable propositions you can concoct an almost infinite list of biases pushing in both directions on it. Ironically, people who use this form of argument seem to be themselves suffering from confirmation bias about the proposition "cognitive bias causes people not to believe that cause X is important"! And also a bias blind spot ("I'm less prone to cognitive bias than all those people who believe in cause Y").

I think, as this illustrates, talking about biases usually isn't that helpful when working out what to do. There are often plausible biases on both sides.

This is a pretty common criticism of behavioral finance, which attempts to use cognitive biases to better understand financial markets and was one of the first major attempts to apply them. Theories based on biases are pretty weak unless they are backed up by a model or some relevant empirical evidence.

At 80k, we don't find understanding biases to be that big a part of making good career decisions. The main ways it comes up are that it raises my credence that people tend not to consider enough options, and that it's useful to use a checklist when comparing options (i.e. to be more systematic).

I think the biggest bias here is that most donors would like to be able to point to their clear successes and the people they have helped. For most people, this biases them against x-risk, because (a) you'll very likely fail to lower x-risk, and (b) even if you succeed, you usually won't be able to demonstrate it.

On the other hand, it's also harder to tell if you've failed.

Like Ben, I doubt this kind of analysis is going to change people's minds much one way or the other.

The biases that Peter Hurford discusses in his classic post "Why I'm skeptical about unproven causes (and you should be too)" seem relevant here.
