John G. Halstead

John Halstead - Research Fellow at the Forethought Foundation. Formerly Head of Applied Research at Founders Pledge and researcher at the Centre for Effective Altruism. DPhil in political philosophy from Oxford.


Comments

New 80k problem profile - Climate change

I'm not sure I understand why you don't think the in/direct distinction is useful. 

I have worked on climate risk for many years and I genuinely don't understand how one could think it is in the same ballpark as AI, biorisk or nuclear risk. This is especially true now that the risk of >6 degrees seems to be negligible. If I read about biorisk, I can immediately see the argument for how it could kill more than 50% of the population in the next 10-20 years. With climate change, for all the literature I have read, I just don't understand how one could think that. 

You seem to think the world is extremely sensitive to what the evidence suggests will be agricultural disturbances of the kind we live through all the time: the shocks are well within the normal range that we might expect to see in any decade. This chart shows the variation in the food price index. Between 2004 and 2011, it increased by about 200%. This is much, much bigger than any posited effect of climate change that I have seen. One could also draw lots of causal arrows from this to various GCRs. Yet I don't see many EAs argue for working on whatever were the drivers of these changes in food prices. 

New 80k problem profile - Climate change

I agree it is not where the action is but given that large sections of the public think we are going to die in the next few decades from climate change, it makes lots of sense to discuss it. And, the piece makes a novel contribution on that question, which is an update from previous EA wisdom. 

I took it that the claim in the discussed footnote is that working on climate is not the best way to tackle pandemics, which I think we agree is true. 

I agree that it is a risk factor in the sense that it is socially costly. But so are many things. Inadequate pricing of water is a risk factor. Sri Lanka's decision to ban chemical fertiliser is a risk factor. Indian nationalism is a risk factor. Etc. In general, bad economic policies are risk factors. The question is: is the risk factor big enough to change the priority cause ranking for EAs? I really struggle to see how it is. It is true that perceived climate injustice in South Asia could matter for bioterrorism, but this is very, very far down the list of levers on biorisk. 

New 80k problem profile - Climate change

(In that case, he said that the post ignores indirect risks, which isn't true.)

On your first point, my claim was "I have never seen anyone argue that the best way to reduce biorisk or AI is to work on climate change". The papers you shared also do not make this argument. I'm not saying that it is conceptually impossible for working on one risk to be the best way to work on another risk. Obviously, it is possible. I am just saying it is not substantively true about climate on the one hand, and AI and bio on the other. To me, it is clearly absurd to hold that the best way to work on these problems is by working on climate change. 

On your second point, I agree that climate change could be a stressor of some conflict risks in the same way that anything that is socially bad can be a stressor of conflict risks. For example, inadequate pricing of water is also a stressor of India-Pakistan conflict risk for the same reason. But this still does not show that it is literally the best possible way to reduce the risk of that conflict. It would be very surprising if it were since there is no evidence in the literature of climate change causing interstate warfare. Also, even the path from India-Pakistan conflict to long-run disaster seems extremely indirect, and permanent collapse or something like that seems extremely unlikely. 

New 80k problem profile - Climate change

I don't think the post ignores indirect risks. It says "For more, including the importance of indirect impacts of climate change, and our climate change career recommendations, see the full profile."

As I understand the argument from indirect risk, the claim is that climate change is a very large and important stressor of great power war, nuclear war, biorisk and AI. Firstly, I have never seen anyone argue that the best way to reduce biorisk or AI is to work on climate change. 

Secondly, climate change is not an important determinant of great power war, according to all theories of great power war. The great power wars that EAs most worry about are between the US and China and the US and Russia. The main posited drivers of these conflicts are one power surpassing the other in geopolitical status (the Thucydides trap); defence agreements made over contested territories like Ukraine and Taiwan; and accidental launches of nuclear weapons due to a wrongly perceived first strike. It's hard to see how climate change is an important driver of any of these mechanisms. 

New 80k problem profile - Climate change

I think there is good reason to focus on direct extinction given their audience. As they say at the top of their piece, "Across the world, over half of young people believe that, as a result of climate change, humanity is doomed."

What is your response to the argument that because the direct effects of AI, bio and nuclear war are much larger than the effects of climate change, the indirect effects are also likely much larger? To think that climate change has a bigger scale than, e.g., bio, you would have to think that even though climate's direct effects are smaller, its indirect effects are large enough to outweigh the direct effects. But the direct effects of biorisk seem huge. If there is genuinely democratisation of bio WMDs, then you get regular cessation of trade and travel, there would need to be lots of surveillance, would everyone have to live in a biobubble? Etc. The indirect effects of climate change that people talk about in the literature stem from agricultural disruption in low-income countries leading to increased intrastate conflict in low-income countries (though the strength/existence of the causal connection is disputed). While these indirect effects are bad, they are orders of magnitude less severe than the indirect effects of biorisk. I think similar comments apply to nuclear war and to AI. 

The papers you have linked to suggest that the main pathway through which climate change might destabilise society is via damaging agriculture. All of the studies I have ever read suggest that the effects of climate change on food production will be outpaced by technological change and that food production will increase. For example, the chart below shows per capita food consumption on different socioeconomic assumptions and on different emissions pathways for 2.5 degrees of warming by 2050 (for reference, 2.5 degrees by 2100 is now widely thought to be business as usual). Average per capita food consumption increases relative to today on all socioeconomic pathways considered. 

Source: Michiel van Dijk et al., ‘A Meta-Analysis of Projected Global Food Demand and Population at Risk of Hunger for the Period 2010–2050’, Nature Food 2, no. 7 (July 2021): 494–501, https://doi.org/10.1038/s43016-021-00322-9 .

Focus of the IPCC Assessment Reports Has Shifted to Lower Temperatures

I think the shift in temperature focus is almost entirely because of the Paris Agreement. It's pretty natural that they would mention 2 degrees and 1.5 degrees a lot given Paris. Indeed, they had a special report on 1.5 degrees for that reason. I don't think it implies a change in research focus in the main reports since, as we have seen almost all impacts lit assesses the effects of RCP8.5. 

Given that the RCP mentions have been pretty constant (barring RCP6 being mentioned less), I don't really see that there has been any change in research focus. I especially don't think it is true to say that the climate science literature is ignoring impacts of more than 3 degrees: that is just very clear if you dig into the impacts literature on any particular impact. In fact, the impacts literature focuses a lot on 4.3 degrees and assumes that we will have little adaptive capacity to deal with that. 

Focus of the IPCC Assessment Reports Has Shifted to Lower Temperatures

Hiya, I think the latest IPCC report reflects the literature in that it also focuses on RCP8.5 (i.e. 4 degrees). You have sampled temperature mentions, but I think if you had sampled RCP mentions, your main finding would no longer stand.

For example, for the latest IPCC report, pretty much every graph includes the impact of RCP8.5. 

Agriculture

Ocean ecosystems

Coral reef

Shoreline change

Phytoplankton phenology

Marine species richness

Marine biomass

etc

Deferring

I thought this post by Huemer was a nice discussion of deference: https://fakenous.net/?p=550

Focus of the IPCC Assessment Reports Has Shifted to Lower Temperatures

As I mentioned in my comment on your earlier post, I don't think the headline claim here is correct. The majority of the impacts literature focuses on the impacts of RCP8.5, the highest emissions pathway, which implies 4.3 degrees of warming. Moreover, papers often use RCP8.5 in combination with Shared Socioeconomic Pathway 3 (SSP3), a socioeconomic future with low economic growth, especially for the poorest countries. SSP3 is not actually compatible with RCP8.5. For this reason, the impacts literature has been criticised, in my view correctly, for being excessively pessimistic. So I think the reverse of what you say is correct.

New substack on utilitarian ethics: Good Thoughts

These posts are very good. I do feel there is a lack of simple and effective arguments for utilitarianism that get missed even by professional philosophers. Most glaringly, there are clear and, to my eyes, fatal problems for most stated deontological theories which people just ignore when talking about utilitarianism. Deontology seems much less well-developed than utilitarianism on so many fronts.
