Ross_Tieman


Comments

Make a $100 donation into $200 (or more)

That is a good point - upon reflection, the risk of organizations being removed outweighs the potential benefit. Thanks for looking into the terms and conditions.

Make a $100 donation into $200 (or more)

No longer considered a good idea:

Why don't EA orgs just use this as an arbitrage opportunity?

Create a spreadsheet and share it on the forum.
EA orgs can add themselves to the spreadsheet if they agree to donate $100 to all other charities on the list at time x.

If 10 sign up, each sends out $900 (to the 9 others) and receives $1,800 back.

If 30 sign up, each sends out $2,900 and receives $5,800 back, with the group as a whole consuming $87,000 (870 matched donations) of the total match remaining.

At 4:57 pm GMT there is ~$194K left for matching.
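A minimal sketch of the payoff arithmetic, assuming each participating org donates $100 to every other org on the list and each donation is matched 1:1 until the pool runs out (the figures here are the hypothetical ones from above, not confirmed terms of the match):

```python
# Sketch of the matching-arbitrage arithmetic (hypothetical figures).
# Assumes each of n orgs donates $100 to every other org on the list,
# and the matching pool doubles each donation until it is exhausted.

DONATION = 100             # dollars per org-to-org donation
MATCH_REMAINING = 194_000  # ~$194K left in the pool at time of writing

def payoff(n_orgs: int):
    """Return (sent per org, received per org, total match consumed)."""
    donations_per_org = n_orgs - 1                   # one donation to each other org
    out_per_org = DONATION * donations_per_org
    back_per_org = 2 * DONATION * donations_per_org  # each donation is matched 1:1
    match_consumed = DONATION * n_orgs * donations_per_org
    return out_per_org, back_per_org, match_consumed

for n in (10, 30):
    out, back, consumed = payoff(n)
    print(f"{n} orgs: ${out:,} out, ${back:,} back each; "
          f"match consumed ${consumed:,} "
          f"(pool sufficient: {consumed <= MATCH_REMAINING})")
```

For 10 orgs this prints $900 out and $1,800 back per org; for 30 orgs, $2,900 out and $5,800 back, consuming $87,000 of the pool.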

This would also be an interesting case study in game theory: does EA have what it takes?

AGI safety and losing electricity/industry resilience cost-effectiveness

In response to:

(1) Regarding:
Reasons that civilization might not recover include: ...
Are the reasons mentioned in this section what leads to the estimated reduction in far future potential in Table 1? Or are there other reasons that play into those estimates as well?

The reasons civilization might not recover discussed in the introduction are intended to provide evidence that recovery from civilizational collapse is unlikely; it is not an exhaustive list, but it covers some of the major arguments. The values in Table 1 for the reduction in far future potential were obtained from a survey of existential risk researchers at EA Global 2018 (see methods):

At the Effective Altruism Global 2018 San Francisco conference, with significant representation of people with knowledge of existential risk, a presentation was given and the audience was asked about the 100% loss of industry catastrophes. The questions involved the reduction in far future potential due to the catastrophes with current preparation and if ~$30 million were spent to get prepared. The data from the poll were used directly instead of constructing continuous distributions.

(2) Regarding:
Another way to far future impact is the trauma associated with the catastrophe making future catastrophes more likely, e.g. global totalitarianism (Bostrom & Cirkovic, 2008)
Intuitively I feel that the trauma associated with the catastrophe would make people prioritize GCR mitigation and thereby make future catastrophes less likely. Or is the worry that something like global totalitarianism would happen precisely in the name of GCR mitigation?

The 'trauma' refers to flow-on societal impacts from the various challenges faced by humans in a post-collapse civilization. Although intuitively it would make sense to coordinate, the harshness of the new environment would likely result in small groups simply trying to survive. Humans in this heavily degraded civilization would have to deal with a range of basic challenges (food, water, shelter) and would exist without access to previous technologies, making it unlikely they could coordinate on a global scale to address global catastrophic risks. Surviving the new environment might also favour the development of stable yet repressive social structures that would prevent the rebuilding of civilization to previous levels. This could be facilitated by dominant groups having access to technology from the previous civilization.

There are obviously a lot of unknowns concerning post-collapse scenarios.

AGI safety and losing electricity/industry resilience cost-effectiveness

Figures and tables are now visible. I am unsure why they stopped displaying; apologies for any inconvenience.