A bunch of posts related to The Precipice
I recently finished Toby Ord's The Precipice, and thought it was an excellent and very important book. I plan to write a bunch of posts that summarise, comment on, or take inspiration from various parts of it. Most are currently very early-stage, but the working titles are below.
- Is anyone already planning to write similar things? I probably won't have time to write everything I've planned. So if someone else is already likely to pursue ideas similar to some of these, we could potentially collaborate, or I could share my notes and thoughts, let you take that particular topic from there, and allocate my time to other things.
- Defining existential risks and existential catastrophes
- My thoughts on Toby Ord's policy & research recommendations
- Existential security
- Civilizational collapse and recovery: Toby Ord's views and my doubts
- The Terrible Funnel: Estimating odds of each step on the x-risk causal path (this title is especially "working")
	- The idea here would be to adapt something like the "Great Filter" or "Drake Equation" reasoning to estimating the probability of existential catastrophe, using data on how humanity has fared in prior events that passed, or could have passed, particular "steps" on causal chains to catastrophe.
- E.g., even though we've never faced a pandemic involving a bioengineered pathogen, perhaps our experience with how many natural pathogens have moved from each "step" to the next one can inform what would likely happen if we did face a bioengineered pathogen, or if it did get to a pandemic level.
	- This idea seems sort of implicit in The Precipice, but isn't really spelled out there. Also, as is probably obvious, I still need to organise my own thoughts on it.
- This may include discussion of how Ord distinguishes natural and anthropogenic risks, and why the standard arguments for an upper bound for natural extinction risks don’t apply to natural pandemics. Or that might be a separate post.
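	- To make the "funnel" idea concrete, here is a minimal sketch of the kind of arithmetic I have in mind, in the Drake Equation style: estimate the probability of each transition on a causal chain from historical frequencies, then multiply the step probabilities together. All step names and counts below are made up purely for illustration; they are not estimates from The Precipice or anywhere else.

```python
# Hypothetical "Terrible Funnel" sketch: model a catastrophe as a causal chain
# of steps, estimate each step's transition probability from how often past
# events passed that step, and multiply them (Drake-Equation style).
# ALL COUNTS BELOW ARE INVENTED FOR ILLUSTRATION ONLY.

def step_probability(passed: float, reached: float) -> float:
    """Fraction of events that, having reached a step, went on to pass it."""
    return passed / reached

# Invented counts for a pathogen chain:
# outbreak -> epidemic -> pandemic -> civilisation-threatening catastrophe
steps = [
    ("outbreak -> epidemic", 40, 400),    # 40 of 400 outbreaks became epidemics
    ("epidemic -> pandemic", 4, 40),      # 4 of 40 epidemics became pandemics
    ("pandemic -> catastrophe", 0.1, 4),  # none observed; 0.1 is a pseudo-count prior
]

p_chain = 1.0
for name, passed, reached in steps:
    p = step_probability(passed, reached)
    p_chain *= p
    print(f"P({name}) = {p:.3f}")

print(f"P(full chain | outbreak) = {p_chain:.5f}")
```

	- One appeal of this decomposition is that evidence from natural pathogens could inform the early steps of the chain even for a bioengineered pathogen, where we have no direct historical data for the chain as a whole.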
- Developing - but not deploying - drastic backup plans
- “Macrostrategy”: Attempted definitions and related concepts
- This would relate in part to Ord’s concept of “grand strategy for humanity”
- Collection of notes
- A post summarising the ideas of existential risk factors and existential security factors?
- I suspect I won’t end up writing this, but I think someone should. For one thing, it’d be good to have something people can reference/link to that explains that idea (sort of like the role EA Concepts serves).