
Are we not being imaginative enough when thinking about scenarios of climate-caused catastrophes?

From my understanding, the broader EA community and literature generally view climate change as dangerous but posing relatively low existential risk. For example, on the 80,000 Hours site, the estimate given for the existential risk posed by climate change is around 1 in 10,000, whereas other cause areas such as AI risk are estimated to be orders of magnitude higher, at around 1 in 10.

I believe that the existential risk posed by climate change is underestimated in the EA community. This post on the forum provides good arguments for why relatively high certainty in estimates of the amount of warming does not mean high certainty in estimates of the amount of damage such warming can cause. Uncertainties in estimating the damage caused by warming can significantly affect how high the existential risk posed by climate change is.

When thinking about the damage climate change may cause, and whether it can be existential in scale, we tend to defer to climate economic models that attempt to estimate potential damages from various levels of temperature increase. Such models predict an almost insignificant drop in GDP resulting from climate change, and these results are often interpreted to suggest that the potential damage from climate change is likely to be insignificant, and therefore not existential in nature.

However, when discussing other risks such as AI risk, much more speculative scenarios are used to argue that misaligned AI could lead to existential consequences: for example, killer nanobots or the spread of lethal designer DNA sequences. Of course, it makes sense to speculate about scenarios for AI risk, because it's difficult to model something that does not yet exist and that we don't understand well. AGI is also taken to be an entity unconstrained in its abilities, so almost any speculative scenario can be justified somehow. The potential scenarios stemming from climate change, on the other hand, are much more constrained.

This is perhaps why many people have found it hard to imagine how climate change could lead to existential catastrophe. Climate scientists also tend to be conservative in modeling the consequences of climate change, and to avoid speculative scenarios.

I think it may be the case that, for climate change, people are not being imaginative or speculative enough. If we apply the same level of speculative thinking used for AI risk to climate change, we too can come up with catastrophes of existential scale. Here are some possible scenarios. I do not have probability estimates for any of these happening; they are purely an attempt at speculation.

Crop-failure-induced nuclear war

With an increase in global temperatures, some areas of the Earth will experience a much larger increase than others. The probability of heat waves occurring will also increase significantly. South Asia is one region that already experiences hot summers and will see even higher temperatures with global warming.

A prolonged drought and heat wave, say lasting 1–2 months during the growing season of wheat in South Asia, destroys 50% of India's and Pakistan's wheat harvest for the year through heat stress. A loss of 50% of the wheat crop creates a staple-food shortage of 60 million tons in India and 12.5 million tons in Pakistan. This is about 40% of the global export volume of wheat, which many countries without food self-sufficiency normally buy up, so there may be no spare capacity to absorb the shortfall.
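As a rough sanity check, the shortfall arithmetic above can be sketched in a few lines (the harvest-loss figures are the ones in the scenario; the global export volume is my own rough assumption, not a sourced figure):

```python
# Back-of-the-envelope check of the wheat-shortfall numbers above.
# All figures are illustrative assumptions, in million tonnes (Mt).
india_shortfall_mt = 60.0        # ~50% of India's annual wheat harvest
pakistan_shortfall_mt = 12.5     # ~50% of Pakistan's annual wheat harvest
global_wheat_exports_mt = 190.0  # rough recent global export volume (assumption)

total_shortfall_mt = india_shortfall_mt + pakistan_shortfall_mt
share_of_exports = total_shortfall_mt / global_wheat_exports_mt

print(f"Combined shortfall: {total_shortfall_mt} Mt")
print(f"Share of global wheat exports: {share_of_exports:.0%}")  # roughly 40%
```

Whether 40% of export volume can actually be absorbed depends on stocks and substitution between staples, which this sketch ignores.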

Facing famine, Pakistan invades neighboring Iran, while India invades mainland Southeast Asia to obtain food. A nuclear exchange breaks out between the nuclear powers and soon escalates to global nuclear war, leading to the near extinction of humans.

Failed transition to renewables leads to runaway warming and resource exhaustion

The transition to renewables requires huge amounts of minerals, including copper, lithium, graphite, and rare earth metals. Some studies suggest that the amount of minerals needed for a full transition to renewables may exceed the Earth's reserves. If correct, this means that a full transition is physically impossible to accomplish.

A failed transition means that any IPCC projected scenario that assumes carbon neutrality is called into question. We might expect temperatures to follow trajectories closer to RCP8.5. High warming beyond certain tipping points may trigger runaway warming through feedbacks that are not well understood, such as methane release from permafrost or clathrates. Runaway warming severe enough to render much of the Earth's low and mid latitudes uninhabitable for humans would significantly curtail human potential, and would probably lead to societal collapse well before that point.

On the other hand, a half-completed transition to renewables may deplete all of Earth's reserves of key metals such as lithium or copper. Humans on a mineral-exhausted Earth would be locked in on the planet for the foreseeable future, like Easter Islanders trapped after all the trees were chopped down.

Toxic cyanobacteria blooms destroy coastal cities

Cyanobacteria blooms have been recorded during past climate-induced mass extinctions. With rising temperatures, cyanobacteria blooms occur with greater frequency. Cyanobacteria release toxins known as cyanotoxins, and decaying blooms further release toxic gases such as hydrogen sulfide.

Huge cyanobacteria blooms become common under favorable growing conditions similar to those in past episodes of Earth's history. Coastal regions and inland lakes are periodically covered by blooms during summer. The toxins released render most tropical and temperate coastal and lakeshore cities uninhabitable. Some 40% of the global population lives in coastal regions, which likely generate an even greater proportion of GDP. Toxic blooms cause half of the global economic stock to be abandoned, severely curtailing human development, as every country is now effectively landlocked. Freshwater polluted by blooms leads to severe water stress in some parts of the world. Wars over water and dam control could also break out, leading to outcomes similar to those in the first scenario.

Final thoughts

None of the above scenarios is rigorously researched or comes with good probability estimates, and the actual probability of each happening could be very low; but I would argue the same for individual scenarios such as the killer-nanobot scenario in AI risk. A lot of creativity and speculation goes into thinking up AI takeover scenarios (often with weak justification and lots of hand-waving). People don't seem to apply the same creativity when thinking about climate-caused existential risk scenarios.

Could it be that the community is not thinking hard enough about possible climate existential risk scenarios, and is therefore underestimating the existential risk of climate change? Or that the community is thinking too hard, or too speculatively, about AI risk (motivated thinking?), and not applying the same standards when evaluating all cause areas?

Then again, it may well be true that the nature of AI risk is such that there are a million possible scenarios in which AI could kill everyone, since there are no constraints on what an AI can do. Climate change risk is at least constrained in the sense that the initiating event has to be climate-triggered, so the "scenario space" for climate risk is likely smaller than for AI risk.


Comments (2)

Thanks for this thoughtful post! I think I stand by my 1 in 10,000 estimate despite this.

A few short reasons: 

  • Broad things: First, these scenarios and scenarios like them are highly conjunctive (many rare things need to happen), which makes any one scenario unlikely (although of course there may be many such scenarios). Second, I think these and similar scenarios are reason to think there may be a large catastrophe, but large and existential are a long way apart. (I discuss this a bit here but don't come to a strong overall conclusion. More work on this would be great.)
  • On inducing nuclear war: My estimate of the direct risk of nuclear war is 1 in 10,000, and the indirect risk is 1 in 1,000. The chance that climate change causes a nuclear war, weighted by the extent to which the war was made more likely by climate change rather than by e.g. unrelated geopolitical tensions, is, while subjective and difficult to judge, probably much less than 10%. If it's, say, 1%, this gives 1 in 100,000 indirect x-risk from climate change. This seems a bit small, but consistent with my 1 in 10,000 estimate. Note this includes inducing nuclear war in ways other than crop failure.
  • On runaway warming: My understanding is that the main limit here is how many fossil fuels it's possible to recover from the ground - see more here. Even taking into account uncertainty and huge model error, it seems highly unlikely that we'll end up with runaway warming that itself leads to extinction. I'd also add that lots of the reduction in risk occurs because climate change is a gradual catastrophe (unlike a pandemic or nuclear war), which means that, for example, we may find other emissionless technology (e.g. nuclear fusion) or get over our fear of nuclear fission, etc., reducing the risk of resource depletion. Relatedly, unless there is extremely fast runaway warming over only a few years, the gradual nature of climate change increases the chances of successful adaptation to a warmer environment. (Again, I mean adaptation to prevent an existential catastrophe - a large catastrophe that isn't quite existential seems far far more likely.)
  • On coastal cities: I'd guess the existential risk from war breaking out between great powers is also around 1 in 10,000 (within an order of magnitude or so), although I've thought about this less. So again, while cyanobacteria blooms sound like a not-impossible way in which climate change could lead to war (personally I'd be more worried about flooding and migration crises in South Asia), I think this is all consistent with my 1 in 10,000 estimate.
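The probability arithmetic in the nuclear-war bullet above is just a product of the commenter's stated estimates; as a sketch (the 1% attribution figure is their illustrative value, not a researched number):

```python
# Reproducing the indirect-risk arithmetic from the comment above.
indirect_nuclear_xrisk = 1 / 1_000  # commenter's estimate of indirect nuclear x-risk
climate_attribution = 0.01          # illustrative 1% share attributable to climate change

climate_via_nuclear = indirect_nuclear_xrisk * climate_attribution
print(f"Implied climate-via-nuclear x-risk: 1 in {round(1 / climate_via_nuclear):,}")
# → 1 in 100,000, below the overall 1-in-10,000 estimate
```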

If it helps at all, my subjective estimate of the risk from AI is probably around 1%, and approximately none of that comes from worrying about killer nanobots. I wrote about what an AI-caused existential catastrophe might actually look like here.

The transition to renewables requires huge amounts of minerals, including copper, lithium, graphite, and rare earth metals. Some studies suggest that the amount of minerals needed for a full transition to renewables may exceed the Earth's reserves. If correct, this means that a full transition is physically impossible to accomplish.

Reserves are the amount of a mineral that can be extracted at a profit at current prices and with current technology. Even if technology does not improve, higher prices mean that much more mineral can be extracted profitably.
