Epistemic status: Fairly confident

Here are my thoughts on climate change, with EAs and EA-sympathetic people as the target audience:

1. Expected annual mortality from climate change in around 2100 is likely between 200,000 and 2 million.

2. Climate change by itself should not be considered a major near-term global catastrophic risk (>10% chance of causing >10% of human mortality) or existential risk (non-negligible chance of ending human civilization as we know it) by 2100.

3. The effects of climate change on animal welfare are basically unstudied, and rely on crucial considerations we do not know the answers to.

4. It's rather unlikely that mainline climate change mitigation efforts are more cost-effective for improving quality of life or health outcomes for currently living people than mainline global health or development spending.

5. The first 3 of the above points are relatively non-controversial among experts.

6. I consider myself fairly ignorant on this topic, but most educated laymen (including journalists and activists) on all sides who talk about climate change a lot seem even more ignorant than I am.

7. Depending on your moral preferences, other beliefs about the world, and general flexibility, it is probably wise to devote your altruistic energies to working on other cause areas.

8. EAs should be careful about messaging that turns off climate change-focused people, as there is a lot of strategic overlap, including a quantitative focus, caring about “big problems,” and frequently a longtermist outlook.

Meta: I'm experimenting with a new way to use the EA Forum. Instead of a top-level post explaining everything, I'll just have the eight main points I'm most confident in, and then add a bunch of side points /digressions in the comments, where upvotes/downvotes can help decide whether people end up reading those points.

Edit 2019/12/29: I bounded points #1 and #2 to a timeframe of 2100. I still personally believe the original (unbounded) claims, but I think my explicit evidence is too weak and I don't expect debating meta-level arguments to change many people's minds. I appreciate the pushback (public and private) trying to keep me honest.

Comments

Zachary Jacobi and I did some research for a post that we were going to call “Second-Order Effects Make Climate Change an Existential Threat” back in April 2019. At this point, it's unlikely that our notes will be converted into a post, so I'm going to link a document of our rough notes.

The tl;dr of the doc:

Epistemic status: conjecture stated strongly to open debate.

It seems like there is a robust link between heat and crime (at least 1%/°C). We should be concerned that increased temperatures due to climate change will lead to increases in conflict that represent an existential threat.

  • We assumed that:
    • Climate change is real and happening (Claim 0).
    • Conflict between humans is a major source of existential risk (Claim 1).
  • Tessa researched whether increased atmospheric CO2 concentrations would make people worse at thinking (Claim 2).
    • She concluded that there is only mixed evidence that CO2 concentrations affect cognition, and only at very high (i.e. indoor) concentrations.
    • If you are concerned about the CO2 → poor cognition → impulsivity/conflict link, worry about funding HVAC systems, not climate change.
  • Zach researched whether heat makes people more violent (Claim 3).
    • They concluded that "This seems to be solidly borne out by a variety of research and relatively uncontroversial, although there is quibbling about which confounders (alcohol, nicer weather) play a role. On the whole, we’re looking at at least 1%/°C increase in crime. The exact mechanism remains unknown and everything I’ve read seems to have at least one counter-argument against it."
    • The quality of the studies supporting this claim surprised both of us.
  • We did not get around to researching the intersection of food scarcity, climate change, and conflict.

The rough notes represent maybe 4 person-hours of research and discussion; it's a shallow investigation.

Thanks a lot for this. Strongly upvoted.

I don't know if anyone is still planning to research the relationship between (global) warming and increased aggression, but I found a lot of studies that link the two.
It's unclear whether the relationship is linear or curvilinear and it might be that with enough heat the aggression actually decreases.
While the literature is very robust in confirming this link, it strangely doesn't reach a consensus on the mechanism by which higher temperatures increase aggression.

Some additional notes/clarification/sources for each of the above points:

1. Experts seem (understandably, but rather frustratingly) leery of giving exact death tolls, but here are some examples:

https://www.who.int/news-room/fact-sheets/detail/climate-change-and-health

http://www.impactlab.org/news-insights/valuing-climate-change-mortality/

https://www.youtube.com/watch?v=yXqnKzZiuaE&feature=youtu.be

Note that a lot of expected deaths from those estimates come from exacerbating current neglected tropical diseases like malaria and diarrhea, rather than "direct" climate effects like overheating or droughts.

Note also that 200k-2M puts us in the range of "normal" global health problems like malaria, traffic accidents, etc., rather than making it a uniquely terrifying problem.

2. See prior EA writings by Ozy Brennan:

https://forum.effectivealtruism.org/posts/eJPjSZKyT4tcSGfFk/climate-change-is-in-general-not-an-existential-risk

and John Halstead:

https://docs.google.com/document/d/1qmHh-cshTCMT8LX0Y5wSQm8FMBhaxhQ8OlOeRLkXIF0/edit#

This article has quotes that seem representative of what experts believe:

https://climatefeedback.org/claimreview/earth-is-not-at-risk-of-becoming-a-hothouse-like-venus-as-stephen-hawking-claimed-bbc/

EDIT 2020/1/15: Niel Bowerman estimates the direct risk as real but less than 1/10,000 in the next few centuries:

https://forum.effectivealtruism.org/posts/NLJpMEST6pJhyq99S/notes-could-climate-change-make-earth-uninhabitable-for#2pgNMBikjYTkrGuec

3. Assuming a total welfarist view about animals, to figure out whether climate change is good or bad for animal well-being, you literally need reasonable estimates for *each* of the following questions:

- Whether climate change will increase or decrease the total biomass of animals in the wild.

- Whether climate change will increase or decrease the proportion of “moral beings with valence” per unit of biomass.

- Whether animals in the wild have net positive lives right now.

- How climate change will affect the average valence of animals in the wild.

People who talk about climate change's impact on wild animal welfare focus on the sharp disequilibria, but I expect those effects to be relatively small, even on a short timescale, compared to the (basically unknown) level effects.
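
To make concrete why all four questions matter at once, here is a minimal toy model (my own illustrative sketch, not from the post; every number is an arbitrary placeholder, not an estimate):

```python
# Toy model of total wild-animal welfare under a total welfarist view.
# Illustrative only: all numbers are arbitrary placeholders.

def total_welfare(biomass, sentient_fraction, avg_valence):
    # welfare = total biomass
    #         * proportion of "moral beings with valence" per unit biomass
    #         * average valence per being (whose sign is question 3)
    return biomass * sentient_fraction * avg_valence

# Status quo vs. a hypothetical post-warming world; questions 1, 2, and 4
# determine how the three inputs shift under climate change.
baseline = total_welfare(biomass=1.00, sentient_fraction=0.10, avg_valence=-0.20)
warmed   = total_welfare(biomass=0.90, sentient_fraction=0.12, avg_valence=-0.15)

# Whether climate change is net good or bad for animals is the sign of
# this difference, which can flip with any one of the four answers.
print(f"Change in total welfare: {warmed - baseline:+.4f}")
```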


4.

https://forum.effectivealtruism.org/posts/GEM7iJnLeMkTMRAaf/updated-global-development-interventions-are-generally-more


5. In addition to the citations above, a) this is my impression from informal discussions with people who I believe know a lot more about this topic than I do, and b) there is the meta-level evidence that EAs who think a lot about cause prioritization usually don't focus on climate change.


6. For example, most (smart, educated) people I talk to are surprised that increased NTDs are the predominant cause of expected deaths from climate change. Also, I learned recently that the temperature increase is proportional to log(ppm) rather than linear in ppm, which is really obvious in retrospect but which I hadn't thought about, and which I'm willing to bet 80%+ of STEM college grads wouldn't know.


7. This is a very high-level case. I don't know your life, etc., and if you have an unusually good opportunity to make an impact within climate change, or if you have detailed models of how climate change affects the world that are very different from my own, you should probably act on your own viewpoints.

In general, I feel like the burden of proof needed to make life decisions based primarily on some stranger on the internet is quite high, and I don't think I have met it.

That said, some random brainstorming:

Care about helping poor people not die from malaria due to climate change -> work on making sure poor people don't die from malaria, period

Care about the long-term future -> explore other long-termist stuff like AI Safety, biorisk, moral circle expansion etc.

Care about animal welfare -> work on factory farming and look into research on wild animal welfare. (Addendum: I think it'd be surprising but not crazy if climate change work is better for animal welfare than work on reducing factory farming, but in worlds where this is true, your priority in 2019 should probably be to study wild animal welfare rather than to assume the connection.)

Less obvious stuff:

Generically care about the environment -> Look into indoor and outdoor air pollution. (I'm less confident about this suggestion than the previous 3)


8. There are also likely similar values, like caring about the Global South, animals, and future people. It’s very difficult to communicate to someone that you think their life’s work is not necessarily the best thing to do with limited resources (and most people are less used to this criticism than EAs are), so extreme prudence is recommended.

A secondary point is nuance. I think it's bad from both an epistemic and a PR perspective if the message gets distorted from “our best understanding of the situation is that mainline climate change mitigation is unlikely to be the marginal best thing to work on for most EA people with flexible career capital” into something catchier but much less accurate.

It’s very difficult to communicate to someone that you think their life’s work is misguided

Just emphasizing the value of prudence and nuance: I think that this^ is a bad and possibly false way to formulate things. Being the "marginal best thing to work on for most EA people with flexible career capital" is a high bar to clear, one that most people are not aiming for, and work to prevent climate change still seems like a good thing to do if the counterfactual is doing nothing. I'd only be tempted to call work on climate change "misguided" if the person in question believes that the risks from climate change are significantly bigger than they in fact are, and wouldn't be working on climate change if they knew better. While this is true for a lot of people, I (perhaps naively) think that people who've spent their lives fighting climate change know a bit more. And indeed, someone who has spent their life fighting climate change probably has career capital that's pretty specialized towards it, so it might be correct for them to keep working on it.

I'm still happy to inform people (with extreme prudence, as noted) that other causes might be better, but I think that "X is super important, possibly even more important than Y" is a better way to do this than "work on Y is misguided, so maybe you want to check out X instead".

Yeah, I think that's fair. I think in practice most people who get convinced to work on, e.g., biorisk or AI safety issues instead of climate change often do so for neglectedness or personal fit reasons.

Feel free to suggest a different wording on my point above.

EDIT: I changed "misguided"->"necessarily the best thing to do with limited resources"

I also think we have some different interpretations of the connotations of "misguided." I probably mean it in a weaker sense than you're taking it in. E.g., I also think selfishness is misguided because closed individualism isn't philosophically sound, and that my younger self was misguided for not being a longtermist.

I think when considering your estimates for point 1, it is important to consider the boundaries given by those sources and to contextualise them.

The WHO is only looking at disease burden, but even there they are expecting 250k deaths per year by 2050 (not even looking to 2100), and they estimate that CC will exacerbate malnutrition by 3% of current values, which seems extremely conservative. They don't seem to include the range increases for most other insect-transmitted diseases, just malaria, even within the extremely limited subset of causes they consider.

Impactlab's "big data approach": they don't give their assumptions, parameters, or considerations, so I think it should largely be discounted as a result. It seems to be based on historic, within-trend correlational data, not accounting for the risk of higher-level causes of mortality such as international conflicts, political destabilisation, famine, ecological collapse, climate migration, infrastructure damage, etc., which will have an impact and which I am guessing aren't accounted for in their correlational databank.

Danny Bressler is only looking at extrapolating inter-personal conflicts. His model doesn't include famines, pandemics, increased disease burden, ecosystem collapse, great-power conflicts, etc., which are very likely to be much, much worse than the trends considered in his model. As such, his 74 million estimate should be considered an extremely conservative lower bound. He is also showing a significant upwards trend per year, so the burden should be expected to worsen over time.

Overall this seems to cast doubt on points 1, 4, and 5. For point 2, I have also critiqued John Halstead's work in a previous post, and the Ozy Brennan post is refuting CC as an extinction risk, not as a global catastrophic risk as you use it; it says nothing about the chances of a >10% likelihood of a >10% population decrease. These combined should give pause for thought when making statement 7.

While it's some evidence that he considers his analysis still quite conservative, I think the most important contextualization of Danny Bressler's analysis, for our very high-level purposes of understanding expert opinion as laymen, is that (iirc) he perceives it to be a new/contrarian position, where most estimates of the climate change mortality burden are too low (from his perspective).

I also prefer to frame the things I hyperlink as "links I added that I thought might be helpful for understanding the question further," rather than as "sources," but I think that was my own fault for trying to write a short post at the possible expense of accuracy.

This doesn't seem to be the context in which you were dropping the links, seeing as they all have top-level summary numbers that feed into your bounds, and point 1 doesn't say anything about 2M/year being a lower bound, seeing as you use it as the upper bound. I would like to see these other estimates of climate mortality, as they aren't referenced and don't seem to feed into point 1.

On a meta-level, I think disconnecting sources from the context in which you reference them is very unfriendly to the reader, as they have to wade through your links to find what you are saying where; but apparently these aren't even sources meant to add evidence to your claims. So I am further befuddled by your inclusion of incidental contextualising literature when you haven't included references to substantiate your claims. I also think your edit comes across as quite uncharitable to readers (and self-defeating) if you don't think you can change people's minds.

On 1): I'm not able to read the full text of the Impactlab report, but it seems they just model the link between heat and mortality, not the impact of heat on crop production causing knock-on health problems. E.g., http://dels.nas.edu/resources/static-assets/materials-based-on-reports/booklets/warming_world_final.pdf suggests that each degree of warming would reduce current crop yields by 5-15%. So for 4 degrees of warming (baseline according to https://climateactiontracker.org/global/temperatures/ ), this would be a 20-60% reduction in world food supply.

If governments stick to their policies (which they have been notoriously bad at so far), then the reduction would only be 10-30%. I'd expect even a 10% decrease to have massive knock-on effects on the nutrition and mortality of the world. I expect that is not included in the Impactlab report because it is very hard for papers to encompass the entire scope of the climate crisis.

Of course there could be a lot of changes to how and where we grow crops to avoid these problems, but making sure that we manage this transition well, so that people in the global south can adopt the appropriate crops for whatever their climate becomes, seems like something that could use some detailed analysis. It seems neglected as far as I can tell; there may be simple things we can do to help. It is not mainstream climate change mitigation though, so it might fit your bill?



You'd need to think there was a very significant failure of markets to assume that food supplies wouldn't be adapted quickly enough to minimize this impact. That's not impossible, but you don't need central management to get people to adapt; this isn't a sudden change that we need to prep for, it's a gradual shift. That's not to say there aren't smart things that could significantly help, but there are plenty of people thinking about this, so I don't see it as neglected or likely to be high-impact.

I'm expecting the richer nations to adapt more easily, so I'm expecting a swing away from food production in the less rich nations, as poorer farmers would have a harder time adapting as their farms get less productive (and they have less food to sell). Also, farmers whose land is now unproductive would struggle to buy food on the open market.

I'd be happy to be pointed to the people thinking about this and planning on having funding for solving this problem. Who are the people that will fund teaching subsistence rice farmers (of all nationalities) how to farm different crops they are not used to, providing tools and processing equipment for the new crops, etc.? Most people interested in climate change I have met are still in the hopeful mitigation phase, and if they are thinking about adaptation, it is about their own localities.

This might not be a pressing problem now[1], but it could be worth having charities in the space learning how to do it well (or how to help with migration if land becomes uninhabitable).

[1] https://blogs.ei.columbia.edu/2018/07/25/climate-change-food-agriculture/ suggests that some rice producing regions might have problems soon

Given the way climate scientists use those terms, I think of safeguarding soil quality and genetically engineering or otherwise modifying new crops for the heat as more of a climate change adaptation problem than a mainstream mitigation problem.

Tony Allan, whom I quoted in a different comment, also believed that there are a bunch of other ecological problems with the future of our current soil quality. This does seem important?

I don't know nearly enough about the field to have any opinions on tractability or neglectedness (David Manheim who commented below seems to know more).

That said, I personally would be quite surprised if worldwide crop yields actually ended up decreasing by 10-30%. (Not an informed opinion, just vague intuitions about econ).


That said, I personally would be quite surprised if worldwide crop yields actually ended up decreasing by 10-30%. (Not an informed opinion, just vague intuitions about econ).

I hope they won't either, if we manage to develop the changes we need before we need them. Economics isn't magic.

But I wanted to point out that there will probably be costs to preventing food-shortage deaths through adaptation. Are they bigger or smaller than the costs of mitigation by reducing CO2 output or geoengineering?

This case hasn't been made either way to my knowledge and could help allocate resources effectively.

I found this report on adaptation, which suggests adaptation with some forethought will be better than waiting for problems to get worse. It talks about things other than crops too. The headlines:

  • Without adaptation, climate change may depress growth in global agriculture yields up to 30 percent by 2050. The 500 million small farms around the world will be most affected.
  • The number of people who may lack sufficient water, at least one month per year, will soar from 3.6 billion today to more than 5 billion by 2050.
  • Rising seas and greater storm surges could force hundreds of millions of people in coastal cities from their homes, with a total cost to coastal urban areas of more than $1 trillion each year by 2050.
  • Climate change could push more than 100 million people within developing countries below the poverty line by 2030. The costs of climate change on people and the economy are clear. The toll on human life is irrefutable. The question is how will the world respond: Will we delay and pay more or plan ahead and prosper?

For #6, what is your source that temperature increases are proportional to log(CO2 ppm)? This paper indicates that it's a simple proportional relationship, no log: https://iopscience.iop.org/article/10.1088/1748-9326/11/5/055006#erlaa23b8f1

(Caveat here is that I understand much less climate science than I would like, and there are gaps in my knowledge that someone who recently took a few undergrad classes on climate science can fill).

https://www.ipcc-data.org/guidelines/pages/reporting.html says it's logarithmic.

I think this is widely known in the field, for example see here: https://en.wikipedia.org/wiki/Climate_sensitivity#Equilibrium_climate_sensitivity

The equilibrium climate sensitivity (ECS) refers to the equilibrium change in global mean near-surface air temperature that would result from a sustained doubling of the atmospheric equivalent CO2 concentration (ΔT2×).

I think the best way to make sense of that is to think of temperature as proportional to log concentration.
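
To make that doubling-based definition concrete, here's a minimal sketch (my own illustration; S = 3.0 is a round placeholder for the climate sensitivity, not a forecast):

```python
import math

S = 3.0     # placeholder: degrees C of equilibrium warming per CO2 doubling
C0 = 280.0  # approximate pre-industrial CO2 concentration, ppm

def equilibrium_warming(ppm):
    # If every doubling of concentration adds the same warming S,
    # then warming is proportional to log2(C / C0).
    return S * math.log2(ppm / C0)

for ppm in (280, 560, 1120):
    print(f"{ppm:>5} ppm -> {equilibrium_warming(ppm):.1f} C")
# 280 -> 0.0 C, 560 -> 3.0 C, 1120 -> 6.0 C: each doubling adds the
# same increment, i.e. temperature grows with log(ppm), not linearly.
```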

(My actual original source is neither of these; it was an extended discussion with someone who's much more knowledgeable about climate science than I am, but also not a real expert.)

I skimmed your linked article and I don't really understand the discrepancy. I can think of some possible reasons (e.g., there's the trivial sense in which all differentiable functions are locally linear), but I'm not confident in them, so I'll sleep on this and see if maybe someone else can comment on it in the meantime.

In the current regime (i.e. for increases of less than ~4 degrees C), warming is roughly linear with cumulative carbon emissions (which is different from CO2 concentrations). Atmospheric forcing (the net energy flux at the top of the atmosphere due to changes in CO2 concentrations) is roughly logarithmic with CO2 concentrations.

How temperatures will change with cumulative carbon emissions at temperatures exceeding ~4 degrees C above pre-industrial is unknown, but will probably be somewhere between super-linear and logarithmic depending on what sorts of feedback mechanisms we end up seeing. I discuss this briefly at this point in this talk: https://youtu.be/xsQgDwXmsyg?t=520
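
For reference (my addition, not from the talk), the standard simplified expression for the radiative forcing from CO2, from Myhre et al. (1998), is:

$$\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}$$

where C is the CO2 concentration and C0 a reference (e.g. pre-industrial) concentration. If equilibrium warming is roughly proportional to forcing, the logarithmic dependence on concentration follows; the near-linearity of warming with cumulative emissions is a separate empirical result.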

Thanks for this. I think it's valuable when well-informed EAs make easily interpretable claims about difficult questions (another such question is AI risk). This post (including the "appendices" in the comments) strikes a good balance; it is epistemically responsible, yet has clear conclusions.



Thanks for the encouragement!

My best guess is that no similar easily interpretable high-level conclusions exist in AI safety, with a similar degree of confidence and consensus.

I agree that consensus is unlikely regarding AI safety but I rather meant that it's useful when individuals make clear claims about difficult questions, and that's possible whether others agree with them or not. In AI Impacts' interview series, such claims are made (e.g. here: https://aiimpacts.org/conversation-with-adam-gleave/).

Got it, yeah I agree that's really valuable.

Here for easy disagreement:

(Great idea. But I think this would work better if you had the top comment be just "Here for easy disagreement:" then had the sub comments be the ranges, so that the top comment could be upvoted for visibility.)

Edit: In case this isn't clear, the parent was changed. Much better!


Great idea! Since the top-level comment doesn't have any upvotes (yay! Most people don't think this post is completely bonkers. :P), I'll just edit it to include your suggestion.

Upvote this comment if you disagree with 1-2 of the 8 above points.

Upvote this comment if you disagree with 3-6 of the 8 above points.

Upvote this comment if you disagree with 7/8 or 8/8 of the above points.

By the way, while I still stand by the core claims in the post, COVID-19 has made me update towards being more worried about tail risks from extreme climate change (though I continue to think it's very overrated in the general public). A lot of the reason I didn't (and mostly still don't) think climate is a large direct existential or global catastrophic risk is that when I dive into the numbers/direct statements from climate scientists, the closer you get to the actual modeling, the more sanguine people are.

However, I've since gotten more pessimistic both about how good people in general are at estimating extreme probabilities and about which biases they have. Specifically, naively I'd have guessed that climate scientists are biased to believe their problem is more important, but now I think there are larger effects from respectability, plus selection effects from less alarmist and more certain papers getting funding.

To a lesser extent, I've updated moderately towards civilizational inadequacy being common even in the face of slow-moving and large problems. I don't want to update too strongly because a) there are issues with over-indexing on one example, and b) I suspect that, as much as I want to guard against biases in this direction, in practice there's a high chance I would be counterfactually more cavalier about government adequacy if I lived in China, Mongolia, or New Zealand.

I don't want the verbal arguments to sway people too strongly; you can see my comments on Metaculus for more probabilistic reasoning.

Having looked at your sources I am not sure they justify the conclusions.


In particular:

  • Your sources for point 1 seem to ignore the >10% case that the world warms significantly more than expected (they generally look at mortality in the business as usual case).
  • Your sources for point 2 focus on whether climate change is truly existential, but they do seem to point to the possibility of it being a global catastrophe. (Point 2 appears to be somewhat crucial; the other points, especially 1, 4, 5, and 7, depend on it.)

    It seems plausible from looking at your sources that there are tail risks of extreme warming that could lead to a huge global catastrophe (maybe not quite at your cut-off of a 10% chance of 10% mortality, but huge).

    Eg Halstead:
    "On current pledges and promises, we’ll probably end up at around 700ppm by 2100 and increasing well beyond that."
    "at 700ppm, ... there is an 11% chance of an eventual >6 degrees of warming"
    "at 1120ppm, there is between a 10% and 34% chance of >9 degrees of warming"
    "Heat stress ... seems like it would be a serious problem for warming >6 degrees for large portions of the planet ... With 11–12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed"
    "6 degrees would drastically change the face of the globe, with multi-metre sea level rises, massive coastal flooding, and the uninhabitability of the tropics." "10 degrees ... would be extremely bad"

Overall I expect these points 1 and 2 are quite possibly correct, but, having looked through your sources and concluded that they do not justify the points very well, I would place low confidence in them.


Also, on points 4 and 7: I think they depend on what kind of skills and power you have and are using. E.g., if you are long-term focused and have political influence, climate issues might be a better thing to focus on than AI safety risks, which are not really on the political agenda much.

Climate change by itself should not be considered a global catastrophic risk (>10% chance of causing >10% of human mortality)

I'm not sure if any natural class of events could be considered global catastrophic risks under this definition, except possibly all kinds of wars and AI. It seems pretty weird to not classify e.g. asteroids or nuclear war as global catastrophic risks, just because they're relatively unlikely. Or is the 10% supposed to mean that there's a 10% probability of >10% of humans dying conditioned on some event in the event class happening? If so, this seems unfair to climate change, since it's so much more likely than the other risks (indeed, it's already happening). Under this definition, I think we could call extreme climate change a global catastrophic risk, for some non-ridiculous definition of extreme.

The other fairly plausible GCR that is discussed is biological. The Black Death likely killed 20% of the population (excluding the Americas, but not China or Africa, which it affected) in the Middle Ages. Many think that bioengineered pathogens or other threats could plausibly have similar effects now. Supervolcanoes and asteroids are also on the list of potential GCRs, but we have better ideas about their frequency / probability.

Of course, Toby's book will discuss all of this - and it's coming out soon!


Hey there, interesting article! In this talk from the most recent EA Global, Niel Bowerman (climate physics PhD and now AI specialist at 80,000 Hours) gives some thoughts on the relationship between climate change and existential risk. Essentially I think that there's some evidence about point 2 on your list.

  • In his talk, Niel argues that climate change could cause human extinction by itself under some scenarios. These are quite unlikely, but have non-zero probabilities. When we consider that emissions are likely to increase well beyond 2100, beware the "2100 fallacy" of cutting short impact analyses at an arbitrary point in time.
  • The larger contributions, very roughly, probably come from climate change contributing to social collapse and conflict, which themselves lead to existential risks. Toby Ord has called this an 'existential risk factor'. I think the question isn't "Is climate change an existential risk?" but "Does climate change contribute to existential risk?", in which case it seems that the answer might be yes. Or perhaps "Is climate change important in the long-term?" in which case, if we're thinking across multiple centuries, even with lots of technological development, if we're looking at >6C in 2300 (to pick an example), then I think the answer is yes.
  • All of this being said, I still think it's fair to argue that AI, bio, and nuclear are more neglected and tractable relative to climate change.

What do you think of Niel's talk and this framing?

(Commenting before watching the video) I think you're actually understating how much Niel knows:

During the 2008 presidential election Niel was a member of President Obama’s Energy and Environment Policy Team. Niel sat on the Executive Committee of the G8 Research Group: UK, was the Executive Director of Climatico, and was Climate Science Advisor to the Office of the President of the Maldives.

From https://www.fhi.ox.ac.uk/team/niel-bowerman/

I also recall him doing some climate stuff for the UN, but it wasn't listed there. In light of that, I think I'm pretty comfortable just taking the outside view and deferring to him on model uncertainty and extreme low-probability outcomes.

Or perhaps "Is climate change important in the long-term?" in which case, if we're thinking across multiple centuries, even with lots of technological development, if we're looking at >6C in 2300 (to pick an example), then I think the answer is yes.

I don't know, I think even if temperatures rise by 10C, humanity will still survive, but yes, then climate change will be very very important.

Re timing: I picked 2100 because that's where most forecasts I see are, but I think it's also defensible because I have an inside view that things will most likely get crazy for other reasons by then. I think this is outside-view defensible because world GDP has doubled every ~20 years since 1950, so very naively we might expect a world that's 16x(!) richer by 2100. Even if we don't speculate on specific technologies, it seems hard to imagine a 16x richer world that isn't meaningfully different from our own in hard-to-predict and hard-to-plan-for ways.
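
Spelling out the naive arithmetic behind the 16x figure (assuming the ~20-year doubling continues from roughly 2020):

$$2^{(2100 - 2020)/20} = 2^{4} = 16.$$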

All of this being said, I still think it's fair to argue that AI, bio, and nuclear are more neglected and tractable relative to climate change.

I agree about neglectedness. I'm agnostic towards whether climate change mitigation is more tractable than biosecurity or AGI safety.

I asked Niel Bowerman, who did substantial work on climate change before and now works at 80000 Hours, for feedback. Here's what he said:

1. I agree that climate change is a smaller contributor to x-risk than most other things that EAs focus on as x-risks.
2. What really matters is how much climate change increases other x-risks, e.g. how much does climate change increase x-risk due to great power war, etc.
3. Why are you defining GCRs as needing to be >10% probability? No one really has any idea how likely these things are, and I haven't seen a definition with that as part of it, so it seems like an odd addition which you only seem to have added in order to be able to make your point.

On Facebook, a couple of people have asked me on the existential/global catastrophic risk posed by climate change causing or exacerbating widespread (nuclear) war.

Here's what I wrote.

(Note that this is what I personally believe, rather than something I'm confident experts on international relations will agree on)

I think climate change as an indirect GCR is less crazy than climate change as a direct GCR. But I still don't find it very compelling.

I think if you look at the entire history of human armed conflict, deaths attributable to wars are substantially lower than the GCR definition (population adjusted). Also, most deaths historically have been incidental/civilian casualties, which have gone down over time for various reasons, including better medical care, sanitation, and general wealth (so it's less likely that, e.g., war -> pillage -> mass starvation).

So to argue that conflict from climate change leads to a GCR, you need a strong reason that "this time is different." One possible reason is access to nuclear weapons, but for this to be true, you need a compelling reason for people to differentially use nuclear weapons as a result of climate change at a fairly high probability. (There are reasons that also point in the other direction).

For food/water wars specifically, I'm reasonably convinced by Amartya Sen's research that pretty much all famines in recorded history are a result of distribution rather than production (his exact claim is that "no democracy has ever had a serious famine," which I think is too strong, but I think the general trend is correct).

In addition, the evidence for wars over food/resource contention is a lot weaker than I would have naively guessed before looking into it briefly. For example, here's an account from a science journalist who was commissioned to write a book about "water wars" and found that there isn't enough credible evidence to write one:

https://www.nature.com/articles/458282a

So to recap,
1) I think it's relatively implausible that the relatively small food shortages from climate change will result in mass famines.

2) The famine -> war connection is also quite tenuous.

3) The war -> nuclear war connection isn't too strong either.

I agree overall. The best case I've heard for Climate Change as an indirect GCR, which seems unlikely but not at all implausible, is not about direct food shortages, but rather the following scenario:

Assume state use of geoengineering to provide cloud cover, reduce heat locally, or create rain. Once this is started, they will quickly depend on it as a way to mitigate climate change, and the population will near-universally demand that it continue. Given the complexity and global nature of weather, however, this is almost certain to create non-trivial effects on other countries. If this starts causing crop failures or deadly heat waves in the affected countries, they would feel justified escalating this to war, regardless of who would be involved; such conflicts could easily involve many parties. In such a case, in a war between nuclear powers, there is little reason to think they would be willing to stop at non-nuclear options.

[I broadly agree with above comment and OP]

Something I find missing from the discussion of CC as an indirect existential risk is what this means for prioritisation. It's often used implicitly to support CC-mitigation as a high-priority intervention. But in the case of geoengineering, funding for governance/safety is probably on the order of millions (at most), making it many orders of magnitude more neglected than CC-mitigation, and this is similar for targeted nuclear risk mitigation, reducing risk of great power war, etc.

This suggests that donors who believe there is substantial indirect existential risk from CC are (all else equal) much better off funding the terminal risks, insofar as there are promising interventions that are substantially more underfunded.

Are there any states that have committed to doing geoengineering, or even experimenting with geoengineering, if mitigation fails?

Having some publicly stated sufficient strategy would convince me that this was not a neglected area.

Current investment in solar geoengineering is roughly $10 million annually (this may have increased in the last few years), so by most metrics it's really neglected. The main project working on this is the Harvard solar geoengineering research program, which OPP funded with about $2.5 million for a few years in 2016. They've also funded a solar governance program in 2017 for about $2 million. Grants here. Recently, they don't appear to have made any climate-related grants in this space, and it's unclear to me what the funding situation looks like.

Regarding states that have committed to doing things: I didn't find anything on doing a shallow dive. However, there is work being done on it in a number of countries. In particular, scientists in India have raised concerns that current models show potential for drought and famine because of reduced water flow in tropical regions like India. (This is in addition to technical and general governance concerns)

From what I understand, geoengineering is mostly avoided because people claim (incorrectly, in my view) that it signals the country thinks there is no chance to fix the problem by limiting emissions. In addition, people worry that it has lots of complex impacts we don't understand. As we understand the impacts better, it becomes more viable, and more worrisome. And as it becomes clearer over the next 20-30 years that a lot of the impacts are severe, it becomes more likely to be tried.

Yeah that seems plausible, though one thing I'd flag is that while I could sort of see why local use of geoengineering is more valuable in worlds with more climate change than less, the difference doesn't intuitively seem that big to me.

(It does however suggest that maybe EAs should be careful about recommending geo-engineering as a solution to climate change? Not sure.)

Given the complexity and global nature of weather, however, this is almost certain to create non-trivial effects on other countries.

...And even if it could miraculously be prevented from actually causing any local negative weather events in other countries, it would certainly be perceived to do so, because terrible freak droughts/floods/etc. will continue to happen as always, and people will go looking for someone to blame, and the geoengineering project next door will be an obvious scapegoat.

Like how the US government once tried to use cloud-seeding (silver iodide) to weaken hurricanes, and then one time a hurricane seemed to turn sharply and hit Georgia right after being seeded, and everyone blamed the cloud-seeding, and sued, and shut the program down, ...even though it was actually a coincidence! (details) (NB: I only skimmed the wikipedia article, I haven't checked anything)

Re: "water wars". That article is from 2009. Since then there has been Syria.

Hi. I emailed Tony Allan, the social scientist quoted in the Nature op-ed I linked above, about this question:

I have your enquiry about the link between water scarcity and the present armed conflict in Syria.
My position is the same as it was when I concluded, as did others, in the 1980s that armed conflict could take place between farmers and between villages that shared a water resources for irrigation.  But nations have not gone to war over water. The latter outcome was and remains a consequence of the willingness of a few economies - well endowed with water resources - USA, Canada, Brazil, Argentina and Australia - to export food at prices which have not reflected all the production costs, nor any of the costs of water, nor the costs of damaging their water ecosystems, biodiversity and the atmosphere.  Importing food is a no brainer for water scarce economies. They benefit from underpriced food and the exporters give them their environment for nothing. It is an amazing example of willing self harm - in an extreme form in the case of the United States..
The current Syrian crisis is a consequence of not being able to organise the reliable import and distribution of underpriced food. 
I attach a paper which introduces a number of ideas about the global food system and its problems. It is an accessible version of the last chapter by myself in the recently published book entitled Water, food and society.  See attachment.
Please get back if you have further questions.
With very best regards

Here's the beginning of the abstract of the linked chapter:

Abstract: Affordable food is a political imperative. There is nothing more expensive: food is grown in a failed market where farmers are price takers, not price setters; they subsidise the rest of us by delivering under-priced food but cannot, at the same time, take good care of the land. The real cost of food is paid by stealing from our children’s future: by running down soil health, biodiversity and water resources; and in emissions of greenhouse gases. And in small part by farm subsidies.

The ecological problems with soil he mentions in his chapter seem to be somewhat related to climate change but climate change doesn't seem to be central to them.

(I skimmed but did not read the paper; if someone's interested in investigating further, ping me and I can forward the email to you.)

Have you read this paper suggesting that there is no good evidence of a connection between climate change and the Syrian war? I found it quite persuasive.

Just flagging that I posted this comment (the parent) from the wrong account (EA Hotel), should've been from this one! [mods, I don't suppose there is any way of correcting this?]

Weird caveat:

The ranges given assume that Transformative Artificial Intelligence and other major transformative technologies are not invented by 2100. I actually personally think it's more likely than not that other things will happen first, but a) none of the expert analyses I've read or skimmed assumed "crazy stuff," and b) so far I'm not aware of any prominent claims that climate change will likely be differentially more important in worlds with other major transformative technologies!

So unless explicitly stated otherwise, assume I'm using "likely" in a weird way to implicitly include "conditional on not the singularity."