I am a curious and engaged ECR with experience in both research and project support roles, focused on finding equitable, sustainable and just answers to the global challenges of our time.
I have experience across a range of fields and topics: from transformation and socio-technical transitions to the role of technology in warfare and security, and from the sustainability and equity challenges of the global food system to the intricacies of space policy.
I currently work for Utrecht University in the Netherlands, but I am keen to build something of my own. My ethos is 'to do interesting work, with good people, that helps humanity and the world'.
Thanks for this post, agree with other comments that it's very well written and clear, and on first reading I agree with the core message even if some of the specific points/evidence offered may be debatable (e.g. discussions in the comments re: Lynas). Upvoted!
I want to draw attention to one major issue with the analysis that also permeates the discussion about climate change as an x-risk, both elsewhere and in the comments here.
'Following, e.g., Halstead, it is instructive to split the question of climate change damages into three numbered questions'
'This reduces damages from anthropogenic Earth system change to global warming only, which is far from ideal but sufficient for the purposes of this post.'
The reduction of the climate, environmental and ecological crises to GHGs and warming alone is a major issue within the existing sustainability literature, public discourse and governmental policy. These crises, and in particular their risks and risk transmission pathways, go far beyond this narrow focus. Ignoring the other ways in which compound human activity is stressing various life-critical Earth systems prevents us from having a clear understanding of how our physical environment relates to x-risks and GCRs.
What are we talking about here?
If we want only to understand whether warming could lead to x-risk, then that is an entirely different question from whether current human activity and its sum consequences on the natural world (could) present an x-risk/GCR. I would argue that the former is an interesting question, but it ignores the fact that the crises we face are not limited to warming: they involve changes of unprecedented speed and pressures across a number of life-critical systems.
We're at risk here of not seeing the whole board, and that's a big problem when we're talking about how far we should prioritise these crises within EA.
Some of the dynamics excluded from discussions limited to warming include eutrophication and arable land loss, biodiversity loss and ecosystem degradation, freshwater use and scarcity, and air pollution.
And that's not including the various social challenges that are fundamentally linked to these crises.
A second point: you've focused on importance here. I would argue there is also a major case to be made for neglectedness, in terms of targeting the most important and potentially impactful 'solutions' or interventions for addressing these crises.
An example: it's often estimated that around 95% of carbon offsets are avoidance credits, which do nothing to offset scope 1 and 2 emissions. We need that money to flow into removal credits instead, but the initial investment needed to make removal technologies competitive and scalable enough for the offset market is not yet there (this is beginning to change, but too slowly).
A second example: taking some of the IPCC mitigation options as starting points, the shift to bikes and e-bikes and to electric light and heavy vehicles has the potential to reduce emissions by 0.2 and 0.8 GtCO2eq/year respectively, compared to 1.1 for energy efficiency improvements and 2.9 for ecosystem restoration. Yet the former two received 2.9bn and 12bn in VC funding over 24 months, compared to just 0.7bn and 0.2bn for energy efficiency and restoration over the same period.
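To make the mismatch concrete, here is a rough back-of-envelope calculation using only the figures cited above (VC funding over 24 months versus estimated annual mitigation potential); the numbers are taken from this comment and are illustrative, not independently verified:

```python
# Illustrative only: figures as cited in the comment above.
# name: (mitigation potential in GtCO2eq/year, VC funding in $bn over 24 months)
options = {
    "bikes & e-bikes":       (0.2, 2.9),
    "electric vehicles":     (0.8, 12.0),
    "energy efficiency":     (1.1, 0.7),
    "ecosystem restoration": (2.9, 0.2),
}

for name, (potential, funding) in options.items():
    # VC dollars per unit of annual mitigation potential
    ratio = funding / potential
    print(f"{name}: ${ratio:.2f}bn of VC funding per GtCO2eq/yr of potential")
```

On these figures, bikes and electric vehicles attract on the order of $14-15bn per GtCO2eq/yr of potential, while energy efficiency and restoration attract well under $1bn per GtCO2eq/yr, a gap of roughly two orders of magnitude.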
Sustainability is full of misallocated funding and a lack of evidence-driven intervention. It is not guaranteed that societies at large (governments, publics and markets) will effectively address and solve this challenge; EA could have a role to play here.
A note: I see a lot of interesting discussion about economic modelling in the comments. Based on experience in my current role working with investors and transitions experts to create scenarios built on these kinds of models, my impression so far is that they do not comprehensively account for the full scope of expected impacts across socio-ecological-technological systems (something investors themselves consistently report and are looking to rectify). However, I will spend some time reading the original papers and models before responding to those comments specifically.
Thank you for writing this profile & post. Two queries/thoughts that came to mind:
Hi John,
Many thanks to you & the others in the comments for the insightful discussion. Could you clarify a few points:
If you have any additional resources to back these statements up I would love to read them - thanks!
Two neglected X-risks
I've been going through various listings of x-risks (and GCRs, to account for some uncertainty re: climate and nuclear) by prominent organisations, and after a brief scan have found that the usual list includes:
No list that I came across (explicitly) includes either:
I would make the case that both of these are potential x-risks and should be taken seriously as objects of research. There appear to be credible reasons to believe that APM is not currently an urgent risk (see https://forum.effectivealtruism.org/posts/gjEbymta6w8yqNQnE/risks-from-atomically-precise-manufacturing by Michael Aird), but the same cannot be said of SETI/METI. After reading through some longtermist/EA and non-EA community research into SETI/METI risks, it seems to be an open question just how probable the risk of finding, or being found by, hostile ETI might be.
In the face of deep uncertainty around a number of key questions (one of which is the small matter of how likely it is that ETIs exist and their likely proximity to us; another of which concerns ETI behaviour, which we must investigate and estimate from a sample size of zero), I suggest that this risk be considered alongside the 'big four', both due to its potentially devastating scope and its potentially high likelihood.
We need to spend time fully researching this risk and identifying the most prudent course of action, even if the end result is that we find it can be safely discounted.
Fantastic to see Rethink expanding, and I absolutely love the idea of the Special Projects program!
Regarding the Special Projects Associate position:
I'm curious as to why this is an Associate-level position and not titled Coordinator? Would you still be open to candidates who are slightly more experienced, yet perhaps not to the extent of the Director position (2-4 years of experience, or analogous to the Researcher tier per your website)?
Separately, would candidates who apply for the Director position be considered for the Associate opening, or indeed other suitable roles within Rethink?
Thanks for writing this post Fin!
I want to express my support for the 'Space governance research centre' idea. I've published a little on Space Policy/Governance and had some very positive feedback from professionals working within the field (e.g. within ESA, Raytheon, companies engaged in EO and so on) supporting the need for proactive policymaking and governance of space activities.
It seems like a natural area for a research centre/think tank/policy lab aimed at policy research and implementation aligned with the principles of longtermism and EA. I would also argue that the current context (right before space policy/governance is really picked up, if indeed that's what will happen) is exactly the right moment to start something like this: an org could have an outsized impact by taking advantage of a relatively neglected environment and shaping the field from the ground up.
Would love to talk to you about this - if you're keen then feel free to reach out to me at j.b.p.davies@uu.nl
Love the idea - just writing to add that Futures Studies, participatory futures in particular, and future scenario methodologies could be really useful for longtermist research. Methods in these fields can be highly rigorous, especially if the scenario design is approached in a systematised way using a well-developed framework. (I've been working with some futures experts as part of a project to design three visions of the future, which have just finished going through a lengthy stress-testing and crowd-sourcing process to open them up to public reflection and input.)
I could imagine various projects that aim to create a variety of different desirable visions of the future through participatory methods, identifying core characteristics, pathways towards them, system dynamics and so on to illustrate the value and importance of longtermist governance to get there. Just one idea, but there are plenty of ways to apply this field to EA/Longtermism!
Would love to talk about your idea more as it also chimes with a paper I'm drafting, 'Contesting Longtermism', looking at some of the core tensions within the concept and how these could be opened up to wider input. If you're interested in talking about it, feel free to reach out to me at j.b.p.davies@uu.nl
Sounds fantastic - drop me an email at j.b.p.davies@uu.nl and I would love to set up a meeting. In the meantime I'll dive into EIP's work!
Thank you for writing this - looking forward to diving into the full report this weekend. Congratulations on finishing what must have been a major undertaking!