HaydnBelfield

Comments

2020 AI Alignment Literature Review and Charity Comparison

[Disclosure: I work for CSER]

I completely agree that BERI is a great organisation and a good choice. However, I will also just briefly note that FHI, CHAI and CSER (like any academic groups) are always open to receiving donations:

FHI: https://www.fhi.ox.ac.uk/support-fhi/

CSER: https://www.philanthropy.cam.ac.uk/give-to-cambridge/centre-for-the-study-of-existential-risk?table=departmentprojects&id=452 

CHAI: If you wanted to donate to them, here is the relevant web page. Unfortunately it is apparently broken at the time of writing - they tell me that any credit card donation can be made by calling the Gift Services Department on 510-643-9789.

Why those who care about catastrophic and existential risk should care about autonomous weapons

FYI, if you dig into AI researchers' attitudes in surveys, they hate lethal autonomous weapons and really don't want to work on them. I will dig up reports, but for now check out: https://futureoflife.org/laws-pledge/

4 Years Later: President Trump and Global Catastrophic Risk

Thanks Pablo, yes it's my view too that Trump was miscalibrated and showed poor decision-making on Ebola and COVID-19, because of his populism and disregard for science and international cooperation.

4 Years Later: President Trump and Global Catastrophic Risk

Thanks Stefan, yes this is my view too: "default view would be that it says little about global trends in levels of authoritarianism". I simply gave a few illustrative examples to underline the wider statistical point, and to highlight a few causal mechanisms (e.g. the demonstration effect, Bannon's transnational campaigning).

4 Years Later: President Trump and Global Catastrophic Risk

Hi Dale,

Thanks for reading and responding. I certainly tried to review the ways Trump had been better than the worst-case scenario: e.g. on nuclear use or bioweapons. Let me respond to a few points you raised (though I think we might continue to disagree!).

Authoritarianism and pandemic response - I'll comment on Pablo's and Stefan's comments. However, just on social progress, my point was simply that one of the reasons authoritarianism around the world is bad is that it limits social progress - I didn't make a prediction about how social progress would fare under Trump.

Nuclear use and bioweapons - as I say in the post, there hasn't been bioweapons development (that we know of) or nuclear use. However, I don't think it's accurate to say this is a 'worry that didn't happen'. My point throughout this post and the last one was that Trump has raised, and will continue to raise, risk. An increase from a 10% to a 20% chance is a big deal if what we're talking about is a catastrophe, and the fact that an event did not occur does not show that the risk did not increase.

On nuclear proliferation, you said "I am not aware of any of these countries acquiring any nuclear weapons, or even making significant progress", but as I said in this post, North Korea has advanced its nuclear capabilities and Iran resumed uranium enrichment after Trump pulled out of the Iran Deal.

Thanks again, Haydn

4 Years Later: President Trump and Global Catastrophic Risk

Hi Ian, 

Thanks for the update on your predictions! Really interesting points about the political landscape.

On your point 1 + authoritarianism, I agree with lots of your points. I think four years ago a lot of us (including me!) were worried about Trump and personal/presidential undermining of the rule of law/norms/democracy, enabled by the Republicans, when we should have been equally worried about a general minoritarian push from McConnell and the rest of the Republicans, enabled by Trump.

On climate change, my intention wasn't to imply stasis or inaction rather than active rollback - I do agree things have gotten worse, and your examples of the EPA and the Department of the Interior make that case well.

EA Organization Updates: September 2020

Reading this was so inspiring and cool!

I think we could probably add a $25m pro-Biden ad buy from Dustin Moskovitz & Cari Tuna, and Sam Bankman-Fried.

https://www.vox.com/recode/2020/10/20/21523492/future-forward-super-pac-dustin-moskovitz-silicon-valley

Avoiding Munich's Mistakes: Advice for CEA and Local Groups

[minor, petty, focussing directly on the proposed subject point]

In this discussion, many people have described the subject of the talk as "tort law reform". This risks sounding technocratic or minor.

The actual subject (see video) is a libertarian proposal to replace the entirety of the criminal law system with a private, corporate system with far fewer limits on torture and constitutional rights. While neglected, this proposal is unimportant (and worse, actively harmful) and completely intractable.

The 17 people who were interested in attending didn't miss out on hearing about the next great cause X.

Avoiding Munich's Mistakes: Advice for CEA and Local Groups

I think I have a different view on the purpose of local group events than Larks. They're not primarily about exploring the outer edges of knowledge, breaking new intellectual ground, discovering cause X, etc.

They're primarily about attracting people to effective altruism. They're about recruitment, persuasion, raising awareness and interest, starting people down the funnel, deepening engagement, and so on.

So it's good not to have a speaker at your event who is going to repel the people you want to attract.
