AI risk for beginners/dummies. I know almost nothing about it, and my guess is I'm not alone. 

Does anyone know who would be good for this talk? I don't.

Jon P:
I think Rob Miles' YouTube channel is a good resource for beginners; he has a lot of nice videos there, and he is a good speaker.
Nathan Young:
Hey Sandy, could you edit your answer and put Rob as a suggested speaker?

I would like to see workshops targeted at people at all different stages of the pipeline (although my expectation is that everyone at EAG would at least know the super basics of what AI risk is and why we might care about it).

So for example you could design a program looking like the following:

  • How should you prioritise AI Safety? - A workshop designed to help you figure out how important you should consider it as a cause area and whether you should personally focus on it
  • So you want to work on AI Safety - A talk for people who have decided to work on AI safety to find out about the opportunities in this space
  • A deep-dive event for people already focusing on AI safety to engage with each other on issues of specific importance

Obviously, you could replace these with different events, but the point is to cover all bases.

I would prefer it if these were three separate comments so I could upvote them separately.

Chris Leong:
It's one unified idea, though, and the idea without the examples would be unclear.

What is the topic of the talk?

Suffering risks, also known as S-risks

Who would you like to give the talk?

Possible speakers could be Brian Tomasik, Tobias Baumann, Magnus Vinding, Daniel Kokotajlo, or Jesse Clifton, among others.

What is the format of the talk?

The speaker would discuss some of the different scenarios in which suffering on an astronomical scale could emerge, such as risks from malevolent actors, a near-miss in AI alignment, and suffering-spreading space colonization. They would then discuss possible strategies for reducing S-risks, as well as some of the open questions around S-risks and how to prevent them.

Why is it important?

So that worse-than-death scenarios can be avoided if possible.

Explain AI risk - Rob Bensinger / Andrew Ngo / Neel Nanda - Workshop

Split people into pairs and have them explain AI risk to one another, then have the other person explain it back. Give tips on how the explanation could be made simpler. Use Slido to take comments on what most people found difficult, then have the speakers answer those. Then try again with a new pair. How do you feel?
