There are a couple of sources which I'd recommend taking a look at.
I have been working on an EA-aligned resource titled "Working on Climate Change as a Technologist", which I've started sharing with a few folks.
The claim that climate change is a major PR issue for EA, if true, is evidence that EA's position on climate change is (in at least this one respect) correct.
I'd like to extend my previous model to have three steps:
(1) EA downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts).
(2) EA downplays the value of working on climate change (e.g. low neglectedness, low tractability).
(3) EA discourages people from working on climate change in favor of other causes.
I think you are arguing that since lots of people already care about climate change, (3) is a sensible outcome for EA. To put this more explicitly, I think you are supportive of (2), likely because you perceive climate change as not neglected. As I've stated in this post, I think it is possible to push back on the claim that climate change is not worth working on because of low neglectedness. In fact, you suggested some good arguments for this below:
"Even though lots of work has gone into solving climate change, the problem is so vastly complex and multidimensional that there's still lots of low-hanging fruit left unpicked, so tractability remains high (and scale is large, so the two together are sufficient for impact)."
"Although lots of work has gone into solving climate change, partial solutions aren't very valuable: most of the impact comes from the last few % of solving the problem. Also [for some reason] we can't expect future work to continue at the same rate as current work, so more marginal work now is especially valuable."
"While lots of resources have already gone into solving climate change, the problem is actually getting bigger all the time! So even though neglectedness is low and falling, returns from scale are high and rising, so marginal work on the problem remains valuable."
But let's put that to one side for a moment, because we haven't talked about (1) yet. Even if you think the best conclusion for EA to draw is (3), I still think it's important that this conclusion is visibly drawn from the best possible information about the expected impacts of climate change. Sections (1), (4), (5), and (7) in my post speak directly to this point.
I look at the way that EA talks about climate change and I think it misses some important points (see in particular section (4) of my post). These gaps in EA's approach to climate change lower my trust in EA cause prioritization, and at the more extreme end they make me think "EA is the community that doesn't seem to care as much about climate change - they don't seem to think the impact will be so bad". I think that's a PR issue for EA.
That's fine. Marginal/social cost of carbon is the superior way to think about the problem.
(A) Carbon budgets express an important idea: continued emissions commit us to particular levels of warming. This is particularly important when we are likely to exceed the 1.5C carbon budget in less than 10 years. (B) The 80,000 Hours problem profile also doesn't mention the marginal/social cost of carbon. (C) The social cost of carbon is usually computed from an IAM, a practice which has been described as follows:
"IAMs can be misleading – and are inappropriate – as guides for policy, and yet they have been used by the government to estimate the social cost of carbon (SCC) and evaluate tax and abatement policies." [Pindyck, 2017, The Use and Misuse of Models for Climate Policy]
In any case, my estimate of the long-term economic costs of climate change (detailed writeup in Candidate Scoring System: http://bit.ly/ea-css ) aggregates over the various scenarios.
I reviewed your writeup, focusing on pages 47-61, and I have some questions/comments.
Short-term impacts (page 56):
Long-run growth (pages 56-57):
If you think that people will like EA more when they see us addressing climate change, why don't you highlight all the examples of EAs actually addressing climate change (there are many examples)?
Appendix 1 lists many of the more recent EA engagements with climate change. I agree that EA has not ignored climate change. However, the point of this post was to discuss some trends which I have observed in how EA engages with climate change.
In any case, estimating the damages of climate change upon the human economy has already been addressed by multiple economic meta-analyses. Estimating the short- and medium-term deaths has been done by GWWC.
There are legitimate causes for concern about both of the sources you cite: the first relies on IAMs, and the GWWC estimate relies on a single 2014 WHO publication which has some important limitations.
Climate change being expected to persist for centuries is conditional upon the absence of major geoengineering. But we could quite plausibly see that in the later 21st century or anytime in the 22nd century.
I agree that this is plausible, but it's not something I'd like to have to rely on.
If we can't stay under 1.5C, we might stay under 2.0C, which is not that much worse.
That depends on what you count as "not that much worse". The IPCC SR15 report projects that hundreds of millions more people will be severely impacted at 2.0C versus 1.5C.
I find this whole genre of post tedious and not very useful. If you think climate change is a good cause area, just write an actual cause prioritization analysis directly comparing it to other cause areas, and show how it's better! If that's beyond your reach, you can take an existing one and tweak it.
[...]
I haven't read those previous posts you've written
[...]
Pick another cause area that's currently highlighted, compare it to climate change, and show how climate change is a more effective cause area.
I already did that: "Review of Climate Cost-Effectiveness Analyses". I would love to get your feedback on that post.
For comparison, on 80K's website right now, AI risk, global priorities research and meta-EA are currently at 26, biosecurity and ending factory farming are at 23, and nuclear security and global health are at 21. So your implicit claim is that, on the margin, climate change is less important than AI and GPR, slightly more important than biosecurity and farmed animal welfare, and much more important than nuclear security and global health (of the bednets and deworming variety). Does that sound right to you? That isn't a gotcha; I am genuinely asking, though I do think some elaboration on the comparisons would be valuable.
That does sound about right to me.
If oodles of smart, motivated young people are super-excited about climate change work, a decent chunk of them will end up doing climate change work. We're proposing that these are the sorts of people who would otherwise make a good fit for EA, so we can assume they're fairly smart and numerate. I'd guess they'd have less impact working on climate change outside EA than within it, but they won't totally waste their time. So if there are lots of these people, then lots of valuable climate change work will be done with or without EA's involvement. Conversely, if there aren't lots of these people (which seems false), the fact that we're alienating (some of) them by not prioritising climate change isn't a big issue.
My claim is that EA currently (1) downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts) and (2) downplays the value of working on climate change (e.g. low neglectedness, low tractability). If you agree that (1, 2) are true, then EA is misleading its members about climate change and biasing them to work on other issues.
Perhaps I have misunderstood your argument, but I think you're saying that (1, 2) don't matter because lots of people already care about climate change, so EA doesn't need to influence more people to work on climate change. I would argue that regardless of how many people already care about climate change, EA should seek to communicate accurately about the impact and importance of work on different cause areas.
My guess is that people should probably say what they believe, which for many EAs (including me) is that climate change work is both far less impactful and far less neglected than other priority cause areas, and that many people interested in having an impact can do far more good elsewhere.
Rather than "many EAs", I would say "some EAs" believe that climate change work is both far less impactful and far less neglected than other priority cause areas.
I am not one of those people. I am currently in the process of shifting my career to work on climate change. Effective Altruism is a Big Tent.
That's a fair point. As per the Facebook event description, I was originally asked to discuss two posts:
I ended up proposing that I write a new post: this post. The event was created with the title "Is climate change neglected within EA?", and I originally intended to give this post the same title. However, I realized that I really wanted to argue a particular side of that question, so I posted the article under a more appropriate title.
You are correct to call out that I haven't actually offered a balanced argument. Climate change is not ignored by EA. As is clear from Appendix 1, there have been quite a few posts about climate change in recent years. The purpose of this post was to draw out some particular trends in how I see climate change being discussed by EA.
Thanks for your comments and for linking to that podcast.
And while you may be right that it's a bit naive to just count all climate-related funding in the world when considering the neglectedness of this issue, I suspect that even if you just considered "useful" climate funding, e.g. advocacy for carbon taxes or funding for clean energy, the total would still dwarf the funding for some of the other major risks.
In my post I argue for an output metric rather than an input metric. In my opinion, climate change will stop being a neglected topic when we actually manage to start flattening the emissions curve. Until that happens, humanity is on course for a much darker future. Do you disagree? Are you arguing that it is better to focus on an input metric (level of funding) and use that to determine whether an area has "enough" attention?
Thanks for your feedback.
Framing climate change as the default problem, and working on other cause areas as defecting from the co-ordination needed to solve it, impedes the essential work of cause-impartial prioritisation that is fundamental to doing good in a world like ours.
I think it's worth emphasizing that the title of this post is "Climate Change Is Neglected By EA", rather than "Climate Change Is Ignored By EA", or "Climate Change Is the Single Most Important Cause Above All Others". I am strongly in favor of cause-impartial prioritisation.
In "Updated Climate Change Problem Profile" I argued that Climate Change should receive an overall score of 24 rather than 20. That's a fairly modest increase.
This post itself argues that EA is losing potential members by not focusing on climate change. But this claim is in direct tension with claims that climate change is neglected. If there are droves of potential EAs who only want to talk about climate change, then there are droves of new people eager to contribute to the climate change movement. The same can hardly be said for AI safety, wild animal welfare, or (until this year, perhaps) pandemic prevention.
I don't agree that there is a "direct tension" here. I'm arguing that (A) climate change really is more important than EA often makes it out to be, and (B) EA would benefit from engaging with people about climate change from an EA perspective. Perhaps as part of this engagement you can encourage them to also consider other causes. However, starting out from an EA position that downplays climate change is both factually wrong and alienating to potential EA community members.
I just wanted to leave a quick note to thank you for writing such a thoughtful and well-researched piece on this important topic ❤️