Open Philanthropy has spent 828 M 2022-$ on its grantmaking portfolio of global catastrophic risks (GCRs)[1]. However, it has not yet published any detailed quantitative models estimating GCRs (relatedly), which I believe would be important for informing both efforts to mitigate them and cause prioritisation. I am thinking of models like Tom Davidson's, which estimates AI takeoff speeds, but outputting the probability of a given annual loss of population or drop in real gross domestic product.
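To illustrate the kind of output I have in mind, here is a minimal Monte Carlo sketch which produces a probability for each severity of annual population loss. All the parameters and distributions below are made-up placeholders, not estimates from any published model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of simulated years

# Hypothetical: annual probability that a global catastrophe occurs,
# drawn from a beta distribution to reflect uncertainty in the rate.
p_catastrophe = rng.beta(2, 200, n)

# Hypothetical: fraction of population lost if a catastrophe occurs,
# drawn from a heavy-tailed lognormal truncated to [0, 1].
loss_given_catastrophe = np.minimum(rng.lognormal(-3, 1.5, n), 1)

occurs = rng.random(n) < p_catastrophe
annual_loss = np.where(occurs, loss_given_catastrophe, 0)

# Output: probability of exceeding each annual population loss threshold.
for threshold in (0.01, 0.1, 0.5):
    p = (annual_loss >= threshold).mean()
    print(f"P(annual population loss >= {threshold:.0%}) ~ {p:.2e}")
```

A serious model would derive the input distributions from data and mechanistic considerations (as Tom's report does for compute and takeoff parameters) rather than assuming them, but the point is the shape of the output: a probability for each severity threshold.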
- ^
According to Open Philanthropy's grants database on 17 February 2024, accounting for the focus areas of "Biosecurity & Pandemic Preparedness", "Forecasting", "Global Catastrophic Risks", "Global Catastrophic Risks Capacity Building", and "Potential Risks from Advanced AI".
Thanks for the comment, Ryan. I agree the report by Joseph Carlsmith is quite detailed. However, I do not think it is sufficiently quantitative. In particular, the probabilities which are multiplied to obtain the chance of an existential catastrophe are directly guessed, as opposed to resulting from detailed modelling (in contrast to the AI takeoff speeds calculated in Tom's report).
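To make the contrast concrete, here is a sketch of that "guess and multiply" structure. The decomposition mirrors the shape of the report's argument, but the numbers are hypothetical placeholders, not Joseph's actual estimates:

```python
# Each factor is a direct guess; no submodel produces it.
guessed_probabilities = {
    "advanced power-seeking AI is developed": 0.5,
    "it is deployed despite misalignment": 0.4,
    "misaligned power-seeking scales to global disempowerment": 0.3,
    "disempowerment amounts to an existential catastrophe": 0.9,
}

p_existential_catastrophe = 1.0
for premise, p in guessed_probabilities.items():
    p_existential_catastrophe *= p

print(f"P(existential catastrophe) = {p_existential_catastrophe:.3f}")
```

Joseph was mostly aiming to qualitatively describe the arguments, as opposed to quantifying the risk: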