The Doomsday Clock is the most prominent symbol of existential risk. While it's helped highlight the risks of nuclear war, it's misaligned with EA values in various ways:
- It does not use quantified probabilities. "100 seconds to midnight" has an ordinal relationship to "90 seconds to midnight" but is not itself a prediction of anything.
- It is fundamentally pessimistic. A doomsday clock implies an inevitable countdown. One can turn back the clock, but midnight still looms.
- It focuses on nuclear war and climate change, whereas EAs tend to see more existential risk in AI.
With this in mind, I thought about what symbol effective altruists might create instead, if we got to choose the main symbol of existential risk. Then I made it. Check it out and let me know what you think; I'm hoping to promote it further. I also have some open questions below.
The X-Risk Tree is a symbol of the branching possibilities facing humanity. Its primary audience is people who are concerned about global catastrophic risk but feel unable to do anything about it. I think a sizable group of people are in this situation, especially among environmentalists. The tree is meant to show that we can prune the branches of our future: that we have the agency to choose a path that avoids doom. Ideally, it feels like an interactive display at a museum.
The numbers are sourced from Metaculus's Ragnarok series. I believe this is an important step up from the Doomsday Clock's non-quantified predictions. However, this approach still has issues, as Linch has pointed out.
Note that alternative predictions from EAs are included on the collections page.
- Would people enjoy being able to input data to generate a tree of their own predictions?
- Would a sharing option for social media (image of tree and text of predictions) be useful?
- I am not totally happy with the title. If you can convince me of a better one, I will provide a $100 bounty.
- What could be high-leverage ways to promote it?
- What could be done on the site to more thoroughly communicate the idea of "existential risk is serious but we can work on it"?
- Is there a better word than 'sustenance' for outcomes where humanity does not suffer a global catastrophe?
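On the question of letting people input their own predictions to generate a personal tree, here is a minimal sketch of what the underlying calculation could look like. Everything here is hypothetical: the risk names and probabilities are placeholders (not Metaculus forecasts), and it assumes the risks are independent, which is a simplification.

```python
# Hypothetical sketch: turn per-risk catastrophe probabilities into
# branch weights for a simple text rendering of an "X-Risk Tree".
# The example numbers are placeholders, not real forecasts.

def branch_weights(risks: dict[str, float]) -> dict[str, float]:
    """Assuming independent risks, return each doom branch's weight
    plus the residual 'no catastrophe' branch. Weights sum to 1."""
    survival = 1.0
    for p in risks.values():
        survival *= 1.0 - p
    doom_total = 1.0 - survival
    # Split the total doom mass proportionally to each risk's probability.
    total_p = sum(risks.values())
    weights = {name: doom_total * p / total_p for name, p in risks.items()}
    weights["no catastrophe"] = survival
    return weights

def render(weights: dict[str, float], width: int = 30) -> str:
    """Render branches as bars, widest (most probable) first."""
    lines = []
    for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        bar = "#" * max(1, round(w * width))
        lines.append(f"{name:<16} {bar} {w:.0%}")
    return "\n".join(lines)

example = {"nuclear war": 0.05, "AI": 0.10, "biorisk": 0.03}
print(render(branch_weights(example)))
```

A user-facing version would replace the `example` dict with form inputs and render actual branching graphics rather than bars, but the proportional-split logic would be the same.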
This was made possible by a Long-Term Future Fund grant.
Linch's feedback was very helpful.
By the way, a shameless plug: if you need a developer, I am currently looking for work!