Thanks for making this, it looks great! Visualizations like this are great for explaining the importance of x-risk and GCR mitigation efforts, providing an intuitive way of understanding the associated probabilities. One recommendation: make the non-selected paths more transparent when 'survival' or 'extinction' is selected; this would make the different cases more obvious. As for what other names might fit: the image is round, so the words that came to mind when I viewed it were 'barometer' or 'compass'. I think both fit what the visualization is doing, either showing the state of risk the way a barometer shows air pressure, or providing information to (hopefully) steer toward better futures, like a compass.
That is a good point. Upon reflection, the risk of organizations being removed outweighs the potential benefit. Thanks for looking into the terms and conditions.
No longer considered a good idea: Why don't EA orgs just use this as an arbitrage opportunity? Create a spreadsheet and share it on the forum. EA orgs can add themselves to the spreadsheet if they agree to donate $100 to every other charity on the list at time X. If 10 sign up, each sends ~$1,000 out and receives ~$2,000 back. If 30 sign up, each sends out ~$3,000 and receives ~$6,000 back, consuming ~$90,000 of the total match remaining. At 4:57 pm GMT there is ~$194K left for matching. This would also be an interesting case study in game theory: does EA have what it takes?
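The arithmetic above can be sketched as follows. This is a hypothetical illustration, not anything from the original scheme: the function name is mine, and it assumes a 1:1 match with each org donating $100 per listed org, which is the simplification that reproduces the rounded figures in the comment.

```python
def matching_arbitrage(n_orgs, donation=100, match_ratio=1.0):
    """Rough per-org flows if each of n_orgs donates `donation`
    dollars per org on the list, with donations matched 1:1.

    Returns (sent, received, total_match_consumed)."""
    sent = donation * n_orgs                      # out: $100 per listed org
    received = donation * n_orgs * (1 + match_ratio)  # in: donations plus the match
    match_consumed = donation * n_orgs * n_orgs * match_ratio  # drawn from the match pool
    return sent, received, match_consumed

# 30 orgs: each sends $3,000, receives $6,000, and ~$90,000 of match is consumed
print(matching_arbitrage(30))
```

With 30 participants the scheme would burn through roughly half of the ~$194K of matching funds remaining, which is part of why it stops looking like a good idea.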
In response to:
Reasons that civilization might not recover include: ...
Are the reasons mentioned in this section what leads to the estimated reduction in far future potential in Table 1? Or are there other reasons that play into those estimates as well?
The reasons civilization might not recover discussed in the introduction are intended to provide evidence that recovery from civilization collapse is unlikely to occur; it is not an exhaustive list, but it covers some of the major arguments. The values in Table 1 on the reduction in far future potential were obtained from a survey of existential risk researchers at EA Global 2018 (see Methods):
At the Effective Altruism Global 2018 San Francisco conference, with significant representation of people with knowledge of existential risk, a presentation was given and the audience was asked about the 100% loss of industry catastrophes. The questions involved the reduction in far future potential due to the catastrophes with current preparation and if ~$30 million were spent to get prepared. The data from the poll were used directly instead of constructing continuous distributions.
Another route to far future impact is the trauma associated with the catastrophe making future catastrophes more likely, e.g. global totalitarianism (Bostrom & Cirkovic, 2008).
Intuitively I feel that the trauma associated with the catastrophe would make people prioritize GCR mitigation and thereby make future catastrophes less likely. Or is the worry that something like global totalitarianism would happen precisely in the name of GCR mitigation?
The 'trauma' refers to flow-on societal impacts due to the various challenges faced by humans in a post-collapse civilization. Although intuitively it would make sense to coordinate, the harshness of the new environment would result in small groups simply trying to survive. Humans of this heavily degraded civilization would have to deal with a range of basic challenges (food, water, shelter) and would exist without access to previous technologies, making it unlikely they would be able to coordinate on a global scale to address global catastrophic risks. Surviving the new environment might also favour the development of stable yet repressive social structures that would prevent the rebuilding of civilization to previous levels. This could be facilitated by dominant groups holding technology from the previous civilization.
There are obviously a lot of unknowns concerning post-collapse scenarios.
Figures and tables are now visible. I am unsure why they stopped displaying; apologies for any inconvenience.