Hi, I’m Florian. I am enthusiastic about working on large-scale problems that require me to learn new skills and extend my knowledge into new fields and subtopics. My main interests are climate change, existential risks, feminism, history, hydrology and food security.
In my personal experience you always get downvotes/disagree votes for even mentioning any problems with gender balance/representation in EA, no matter what your actual point is.
This is just another data point showing that the existential risk field (like most EA-adjacent communities) has a problem when it comes to gender representation. It fits really well with other evidence we have. See, for example, Gideon's comment under this post: https://forum.effectivealtruism.org/posts/QA9qefK7CbzBfRczY/the-25-researchers-who-have-published-the-largest-number-of?commentId=vt36xGasCctMecwgi
On the other hand, there seems to be no evidence for your "men just publish more, but worse papers" hypothesis.
Yeah, fair enough. I personally view the Robock et al. papers as the "let's assume that everything happens according to the absolute worst case" side of things. From this perspective they can be quite helpful for getting an understanding of what might happen: not in the sense that it is likely, but in the sense of what is even remotely in the cards.
Just a side note. The study you mention as especially rigorous in 1) iii) (https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2017JD027331) was done at Los Alamos Labs, an organization whose job it is to make sure that the US has a large and working stockpile of nuclear weapons. It is financed by the US military and therefore has a very clear incentive to talk down the dangers of nuclear winter. For this reason, several well-connected people in the nuclear space I talked to have said this study should not be trusted.
An explanation of why it makes sense to talk down the risk of nuclear winter if you want to maintain a working deterrent is described here: https://www.jhuapl.edu/sites/default/files/2023-05/NuclearWinter-Strategy-Risk-WEB.pdf
What exactly confused you about the code? It only strips down the names and counts them.
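For reference, here is a minimal sketch of that kind of counting. The normalization rules and the input format are illustrative assumptions for the sake of the example, not the actual script or the real TERRA export:

```python
from collections import Counter

def normalize_name(raw_name: str) -> str:
    """Strip whitespace and case differences so that variants like
    'Bostrom, Nick' and 'nick bostrom ' collapse to one key.
    (Illustrative rules only, not the actual script.)"""
    name = raw_name.strip().lower()
    if "," in name:                      # "Last, First" -> "first last"
        last, first = name.split(",", 1)
        name = f"{first.strip()} {last.strip()}"
    return " ".join(name.split())        # collapse internal whitespace

def count_authors(author_lists):
    """author_lists: iterable of author-name lists, one per paper."""
    counts = Counter()
    for authors in author_lists:
        for raw in authors:
            counts[normalize_name(raw)] += 1
    return counts

# Hypothetical example input (not real TERRA data)
papers = [
    ["Bostrom, Nick", "Ord, Toby"],
    ["nick bostrom"],
]
print(count_authors(papers).most_common(25))
```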
That someone's publications are undercounted makes sense given how TERRA works: likely not all publications are captured in the first place, and probably not all captured publications were considered relevant to existential risk. When I look at Bostrom's papers, I see several that I would not count as directly x-risk relevant.
Where exactly did you find the number for Torres? On their own website (https://www.xriskology.com/scholarlystuff) they list 15 papers, and the list only goes to 2020. Torres has published several more papers since then, so this checks out.
I personally did not exclude any papers; I simply used the existing TERRA database. Interestingly, the database only contains one paper by Whittlestone. It seems like the keywords TERRA currently uses did not catch Whittlestone's work. So yes, this is an undercount.
Exactly, this only counts the number of publications.
Thanks for the kind words (it sometimes feels like those are hard to come by on the forum).
"can't really draw much in the way of conclusions from this data" seems like a really strong claim to me. I would surely agree that this does not tell you everything there is to know about existential risk research and it especially does not tell you anything about x-risk research outside classic academia (like much of the work by Ajeya).
But it is based on many people's classifications of what they think belongs to the field of existential risk studies, and therefore I think it gives a good proxy for what people in the field consider part of their field. Also, this is not meant to be the ultimate list but, as stated at the beginning of the post, a way to give people an overview of what is going on.
Finally, I think that this surely tells you something about the participation of women in the field. 1 out of 25 is really, really unlikely to happen by chance.
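To put a rough number on that, here is a back-of-the-envelope sketch. The 30% baseline share of women is purely an illustrative assumption (the true share is not known from this post alone); under it, a gender-blind selection of 25 top publishers would include at most one woman far less than 1% of the time:

```python
from math import comb

# Illustrative assumption only: women make up 30% of researchers in the field.
p, n = 0.30, 25

# Probability of at most 1 woman among the 25 top publishers
# if selection were gender-blind under that assumed share.
prob = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(2))
print(f"P(at most 1 of 25) = {prob:.4f}")   # ~0.0016, i.e. about 0.2%
```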
Just a thought here. I am not sure you can literally read this as EA being overwhelmingly left, as it depends a lot on your viewpoint and on what you define as "left". EA exists both in the US and in Europe, and policy positions that are seen as left, and especially center-left, in the US would often be closer to the center or center-right in Europe.