
This is a concise summary of Eliezer Yudkowsky's Cognitive Biases Potentially Affecting Judgment of Global Risks. I wrote this summary because the paper contains very important information but is a long read. Hopefully this will increase the number of longtermists/x-risk researchers who are familiar with these cognitive biases and their implications. This summary is about 11% as long as the original paper: from ~13,500 words down to ~1,500.

 

Introduction

If the world ends, it will more likely be due to patterned flaws in human thinking than to a deliberate act. This paper lists cognitive biases to be aware of; correcting for them should help reduce existential risk.

 

1. Availability Heuristic

Availability Heuristic: a mental shortcut where people judge the frequency or probability of events based on how easily examples come to mind. 

This often leads to errors, such as overestimating the likelihood of rare events and underestimating common ones, due to biases formed by media reporting and personal experience. It's noted that experiences with small-scale hazards can create a complacent attitude toward larger risks. Consequently, this heuristic might contribute to people's failure to adequately prepare for unprecedented catastrophic events, including those that could lead to human extinction, because they are outside of our collective experience.

 

2. Hindsight Bias

Hindsight Bias: the tendency to believe that one would have correctly predicted the outcome of an event after the outcome is already known, thus overestimating its predictability. Also called "the I-knew-it-all-along effect".

A study showed that students who were told the outcome of historical events rated those outcomes as more predictable than students who were not told. This bias affects legal judgments of negligence: people are more likely to deem an event predictable after it has occurred, despite instructions to avoid such bias. Hindsight bias can also cause us to underestimate the value of preventive measures; after a disaster such as the Challenger explosion, the warning signs look obvious, but beforehand they seemed no more severe than many other warning signs that came to nothing.

 

3. Black Swans

Black Swans: describes rare, unpredictable events with massive impact. These events are outliers that lie outside normal expectations and are so rare that they are not often considered until they occur. 

Hindsight bias and the tendency to overestimate the predictability of the past lead us to be ill-prepared for Black Swans. The result is a lack of preparation for unexpected events, such as the 9/11 attacks, after which specific preventative measures were taken without addressing the broader need to expect the unexpected. The prevention of such events is hard to appreciate or reward, as it's not obvious when a disaster is averted. Hence, society often fails to recognize the value of preventive measures and tends to reward heroic responses to disasters rather than the quiet avoidance of them.

 

4. The Conjunction Fallacy

The Conjunction Fallacy: the tendency to judge a set of specific conditions as more probable than a single general one. Adding details can paradoxically make an event seem more likely, even though the probability of two events occurring together (a conjunction) is always less than or equal to the probability of either one occurring alone.

For example, in one study, participants were asked to rank a set of statements from most to least probable. Statements included "Linda is a bank teller" and "Linda is a bank teller and is active in the feminist movement". Participants tended to rank the latter statement as more likely than the former. The conjunction fallacy can also be observed in individuals' willingness to pay for insurance or prevention for highly specific disaster scenarios. People overvalue the likelihood of complex scenarios and undervalue simpler ones, which can skew decision-making.
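To make the underlying probability rule concrete, here is a minimal Python sketch; the specific probabilities are invented purely for illustration and are not taken from the study:

```python
# Toy illustration of the conjunction rule: P(A and B) <= P(A).
# The numbers below are made up for illustration only.

p_bank_teller = 0.05            # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # assumed P(active feminist | bank teller)

# P(bank teller AND feminist) = P(bank teller) * P(feminist | bank teller)
p_conjunction = p_bank_teller * p_feminist_given_teller

print(p_bank_teller)   # 0.05
print(p_conjunction)   # 0.015
# The conjunction can never exceed the single condition, because we are
# multiplying by a probability that is at most 1.
assert p_conjunction <= p_bank_teller
```

Whatever values you plug in, the detailed statement can never come out more probable than the general one.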

 

5. Confirmation Bias

Confirmation Bias: the tendency to test hypotheses by seeking evidence that confirms them rather than evidence that could disprove them.

Taber and Lodge's experiments on political attitudes have found that even balanced arguments can polarize opinions further. Decisions are often made more quickly than we are aware, and we may be less open to changing our minds than we think. Awareness of our biases is crucial to counteracting this. 

 

6. Anchoring

Anchoring: a cognitive bias where people over-rely on initial, often irrelevant information to make decisions. 

Studies show that irrelevant numbers can influence people's estimates, a phenomenon that persists even with implausible figures and is not mitigated by incentives or warnings. The effect increases under cognitive load or pressure for quick responses. Studies find that stories, even when known to be fiction, can bias individuals' expectations and decisions. 

 

7. The Affect Heuristic

The Affect Heuristic: a mental shortcut where subjective feelings of good or bad influence judgement. People boil down information into "good" and "bad" sentiments, more so with limited time or information. 

Slovic et al. found that people rated a life-saving measure more favorably when it was framed as saving a high percentage of the lives at risk ("saving 98% of 150 lives at risk") than when it was framed as a concrete number ("saving 150 lives"), even though the percentage version saves slightly fewer lives. The percentage framing feels more impactful because 98% is close to the perceived upper limit. Other researchers observed that analysts judged unfamiliar stocks as either good (low risk, high return) or bad (high risk, low return) based on their overall feeling, and that under time pressure, individuals boil nuanced views down to an overall positive or negative sentiment.

 

8. Scope Neglect

Scope Neglect: a cognitive bias where people are insensitive to the quantity of affected entities when making decisions about resource allocation or the valuation of life. 

Experiments have shown that individuals' willingness to pay for the protection of wildlife or the mitigation of risks does not scale linearly with the number of lives or entities affected. For example, the amount people are willing to pay to save birds from oil pond deaths does not significantly increase whether the number of birds is 2,000, 20,000, or 200,000. This also applies to human lives; when the number of lives at risk from contaminated water was increased by a factor of 600, the average amount people were willing to pay only increased by a factor of 4. Psychophysical numbing, a concept related to Weber's Law, suggests that humans perceive and value lives on a logarithmic scale rather than a linear one. Consequently, a large increase in scope leads to a relatively small increase in emotional response or valuation. 

These findings suggest that emotional responses to large-scale problems are not proportionately stronger than those to small-scale ones, and this can have harmful implications for public policy and charitable giving.
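As a toy illustration of psychophysical numbing (an assumed logarithmic valuation, not a model fitted to the studies above), a few lines of Python show how a log-scale response stays nearly flat across huge differences in scope:

```python
import math

# Toy model: suppose the felt "value" of saving N lives or animals grows with
# log(N) rather than with N itself. This is an illustrative assumption only.

def felt_value(n_saved):
    return math.log10(n_saved)

for n in (2_000, 20_000, 200_000):
    print(f"{n:>7,} saved -> felt value {felt_value(n):.1f}")

# Output:
#   2,000 saved -> felt value 3.3
#  20,000 saved -> felt value 4.3
# 200,000 saved -> felt value 5.3
#
# A 100-fold increase in scope raises the felt value by less than a factor
# of two, echoing the nearly flat willingness-to-pay found in the bird study.
```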

 

9. Overconfidence

Overconfidence: people often demonstrate unwarranted faith in their knowledge, predictions, or estimates. 

This tendency is highlighted through various experiments that show individuals and experts alike assigning high confidence levels to their judgments, only to be proven incorrect at rates much higher than their confidence intervals would suggest. 

Overconfidence also leads to what is known as the planning fallacy, where people underestimate the time, costs, and risks of future actions, which can seriously reduce the efficacy of planning.

Luckily, research shows that providing additional information and training on calibration can improve judgement accuracy. Self-awareness of this cognitive bias is crucial for better decision-making and for being more receptive to evidence that contradicts one’s preconceived notions.
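One way to make overconfidence visible, and to practice calibration, is simply to log predictions with stated confidence levels and later compare each level against the observed hit rate. A minimal sketch, using made-up records:

```python
from collections import defaultdict

# Minimal calibration check: group past predictions by stated confidence
# and compare against how often they actually came true.
# The records below are invented purely to illustrate the bookkeeping.

predictions = [
    (0.9, True), (0.9, False), (0.9, True), (0.9, False),  # "90% sure" calls
    (0.7, True), (0.7, True), (0.7, False),                # "70% sure" calls
]

buckets = defaultdict(list)
for stated_confidence, came_true in predictions:
    buckets[stated_confidence].append(came_true)

for stated_confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"stated {stated_confidence:.0%} -> actually correct {hit_rate:.0%}")

# If answers tagged "90% sure" turn out right only ~50% of the time, that gap
# is overconfidence; calibration training aims to shrink it.
```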

 

10. Bystander Apathy

Bystander Apathy: a behavior pattern where people fail to act because they believe someone else will step in. 

The concept of diffusion of responsibility plays a crucial role here, as individuals feel less compelled to act when the sense of personal responsibility is diluted among the crowd. It has been observed that individuals are more likely to evacuate a room filling with smoke if they are alone, and less likely to do so if others in the room remain passive. Collective misunderstanding of a situation arises from the inaction of others, which in turn reinforces each individual's inaction.

This can have significant implications, not only in small-scale emergencies but also in global crises, such as existential risks to humanity. The same psychological factors that lead to inaction in the face of a room filling with smoke may also contribute to inaction in the face of global risks, as individuals may look to others for cues on how to act, leading to a dangerous cycle of inaction. 

To overcome Bystander Apathy, singling out individuals and making direct appeals for help can be effective. This tactic breaks the cycle of observation and inaction, forcing individuals to confront the situation directly and acknowledge their personal responsibility to act. 

 

A Final Caution 

Understanding psychological biases is crucial, but it should not become a tool for dismissing arguments without engaging with the actual content or facts presented. Yudkowsky warns against using psychological knowledge as a shortcut to critique arguments without the necessary technical expertise. Claims should be evaluated on their substance and evidence, not on the psychological profile of the person making the claim or the literary style in which it is presented. He suggests that such a practice can lead to superficial debates that are more about the psychology of the participants than about the actual issues at hand.

 

Conclusion

Thinking about existential risks is subject to the same cognitive errors as any other kind of thinking; the high stakes involved do not inherently make our thinking clearer or more rational. An awareness of heuristics and biases is essential for anyone working on existential risks. The point is to combine domain-specific expertise with a wider understanding of cognitive biases to better predict and mitigate potential disasters. 

Comments

Executive summary: Cognitive biases like availability heuristic and confirmation bias can distort judgement about risks, especially unprecedented ones like human extinction.

Key points:

  1. Availability heuristic leads people to underestimate common risks and overestimate memorable ones. This could downplay preparations for unprecedented catastrophes.
  2. Hindsight bias makes past events seem more predictable than they were. This may undermine appreciation for disaster prevention efforts.
  3. Black swan events are highly impactful but unpredictable, so lack of preparation for them is dangerous.
  4. Conjunction fallacy makes complex scenarios seem more likely than they are. This can skew assessments of risk.
  5. Confirmation bias leads people to seek out confirming evidence rather than critically testing hypotheses.
  6. Anchoring causes people to rely too heavily on initial irrelevant information when making judgments.
  7. Affect heuristic boils information down to good/bad feelings, more so under time pressure. This can distort risk analysis.
  8. Scope neglect means people do not scale concern proportionately to affected entities. So large problems may get inadequate responses.
  9. Overconfidence leads to underestimating the costs, risks and timelines of plans and actions.
  10. Bystander apathy causes inaction when people expect others to act. This can propagate collective inaction even in crises.

 

This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.

Thanks very much!

Two questions: 

(1) I have had to introduce cognitive biases to people myself and was wondering what reasons to give for the precise selection of biases I present. As far as these 10 biases are concerned: is it just a judgement call to choose this specific list of biases, as they seem practically relevant and much discussed? Or is there a more systematic reason for choosing these 10 or any other list?

(2) Yudkowsky's list is from 2008. Much has happened since. It would be nice if there were kind of a running update on which cognitive biases have moved up / down over time in terms of being supported by the evidence.
