Gideon Futerman

695 · Joined Dec 2021

Bio


Working on the interaction between Solar Radiation Modification (Solar geoengineering) and X-Risk, GCRs, Civilisational Collapse risks and Negative State risks. Summer Research fellow at CERI. Lead researcher on the RESILIENCER Project (www.resiliencer.org). Author on the Low Environmental Impact SRM Experimentation Report.

How I can help others

Reach out to me if you have questions about SRM/Solar geoengineering

Comments (56)

Definitely not 2 orders of magnitude too much.

The book was, in Will's words, "a decade of work", with a large number of people helping to write it and a moderately large team promoting it (who did an awesome job!). There were certainly a lot of adverts around London for the book, and Will flew around the world to promote it. I would be hugely surprised if the budget was under $1 million (I know of projects run by undergraduates with budgets over a million!), and to be honest $10 million seems to me in the right ballpark. Things just cost a lot of money, and you don't promote a book for free!

Thanks for this reply Rob. I do think it's pretty strange that no one in the know came forward to tell you or 80K, even in a professional capacity, but that's not really your fault!

Well I have put an edit in there.

Saying I "can't be bothered to send a one line email": I'm not a journalist and really didn't expect this post to blow up as much as it did. I am literally a 19 year old kid and, if I'm honest, not sure that Will's team would respond to me. Part of the hope for this post was to get some answers, which in some cases (e.g. Rob Wiblin, thanks!) I have got, but in others I haven't.

Hey Rob,

Thanks for your in-depth response to this question, by the way; it's really appreciated and exactly what I was looking for from this post! It is pretty strange that no one reached out to you in a professional capacity to correct this, but that certainly isn't your fault!

I've heard it from a number of people saying it quite casually, so I assumed it was correct, as it's the only figure I heard bandied around and didn't hear opposition to it. I just tried to confirm it and don't see it publicly, so it may be wrong. They may have heard it from Emile, I don't know. So take it with a hefty pinch of salt. I don't quite think I have the level of access to just randomly email Will MacAskill to confirm it, unfortunately, but if someone could, that would be great. FYI, I think it probably would have been a fantastic use of $10 million, which is why I also think it's quite plausible.

I suppose the problem with that question from my perspective is I don't think "existential risk due to X" really exists, as I explain in the talk. In terms of the number of percentage points it raises overall risk by, I would put climate change at between <0.01% and 2%, and I would probably put overall risk at between 0.01% and 10% or something. But I'm not sure that I actually have much confidence in many approaches to x-risk quantification (as per Beard et al 2020a), even if it does make quantification easier. Some of the main contributions to risk from climate (though note a number may also be unknown or unidentifiable):

  • Weakening local, regional and global governance
  • Water and food insecurity
  • Cascading economic impacts
  • Conflict
  • Displacement
  • Biosphere integrity
  • Responses increasing systemic risk
  • Extreme weather
  • Latent risk

Mostly these increase risk by:

  • Increasing our vulnerability
  • Multiple stressors coalescing into synchronous failure
  • The major increase in systemic risk
  • The responses we take
  • Cascading effects leading to fast or slow collapse, then extinction

  • Acting as a "risk factor"

I obviously think we need more time to flesh out real cruxes, but I think our cruxes are probably a few-fold:

  • I think I am considerably less confident than you in the capacity of the research we have done thus far to confidently establish climate's contribution to existential risk. To some degree, the sort of evidence you're happier relying on to make negative claims (i.e. not a major contributor to existential risk) I am much less happy using that way, as I think it often (and maybe always will) fails to account for plausible major contributors to the complexity of a system. This is both an advantage of the simple approach, as Toby lays out earlier, but I'm more skeptical of its usage to make negative rather than positive claims.
  • I think you are looking for much better thought-out pathways to catastrophe than I think is appropriate. I see climate as something acting to promote serious instability in a large number of aspects of a complex system, which should give us serious reasons to worry. This probably means my priors on climate are immediately higher than yours, as I'm of the impression you don't hold this "risk emerges from an inherently interconnected world" ontology. This is why I've often put our differences down to our ontology and how we view risk in the real world.
  • Because of my ontology and epistemology, I think I'm happier to put more credence on things like past precedent (collapses triggered by climate change, mass extinctions etc.) and decently formulated theory (planetary boundaries for GCR (although I recognise their real inherent flaws!), the sort of stuff laid out in Avin et al 2018, what's laid out in Beard et al 2021 and Kemp et al 2022). I'm also happier to take on board a broader range of evidence, and look more at things like how risk spreads, vulnerabilities/exposures, feedbacks, responses (and the plausible negatives therein) etc., which I don't find your report deals with convincingly, partially because they are really hard to deal with and partially because, particularly for the heavy tails of warming and other factors, there is a very small amount of research, as Kemp et al lays out. Correct me if I'm wrong, but you see the world as a bit more understandable than I do, so simpler, quantitative, more rational models are seen as more important for making any positive epistemic claim, and so you would somewhat reject the sort of analysis that I'm citing.
  • I'm also exceptionally skeptical of your claim that if direct risks are lower, then indirect risks are lower; although I would reject the use of that language full stop.

I also think it's important to note that I make these claims (mostly) in the context of x-risk. In "normal" scenarios, I think I would fall much closer to agreeing with you on a lot of things. But I think I have both a different ontology of existential risk (emerging mostly out of complex systems, so more like what's laid out in Beard et al 2021 and Kemp et al 2022) and, perhaps more importantly, a more pessimistic epistemology. As (partially) laid out when I discuss Existential Risk, Creativity and Well Adapted Science in the talk, I think that with existential risk, negative statements ("this won't do this") actually have a higher evidentiary burden than positive statements of a certain flavour ("it is plausible that this could happen"). Perhaps because my priors of existential risk from most things are pretty low (owing, I think, in part to my pessimistic epistemology), it just takes much more evidence to cause me to update downwards than to be like "huh, this could be a contributor to risk actually!"

Does this answer our cruxes? I know this doesn't go into object-level aspects of your report, but I think it may do a better job of explaining why we disagree, even when I do think your analysis is top-notch, albeit with a methodology on existential risk that I disagree with.

I also think it's important that you know I'm still not quite sure I'm using the right language to explain myself here, and that my answer is about why I find your analysis unconvincing, rather than it being wrong. Perhaps as my views evolve I will look back and think differently. Anyway, I really would like to talk to you more about this at some point in the future.

Does this sound right to you?

Hi John, thanks for the comment; I've DM'd you about it. I think it may be easier if we had the discussion in person before putting something out on the forum, as there is probably quite a lot to unpack, so let me know if you would be up for this?

Honestly, I'm not sure why I seem to have become NeoMohist's enemy! I just posted some questions; sure, they were questioning and not the most sympathetic to the leadership, but that's hardly enough, I think, to warrant this. On the other hand, I am sure NeoMohist is going through a difficult time like many of us, so I sort of get jumping to attack me (I'm sure I have been similarly unreasonable at times in the past).
