
Raphaël S

85 karma · Joined Jun 2021

Comments (16)

This will be very valuable to me, thanks!

Here is the curriculum of ML4Good, an AGI safety camp organized by EffiSciences to train prosaic alignment researchers.

The program contains many programming exercises.

Babbling without any pruning:

- It is very difficult to imagine such an AI.
- Currently, computers and AIs in video games are stupid.
- Collective representations such as Ex Machina or Terminator reinforce the idea that this is only fiction.
- Understanding the orthogonality thesis requires a fine-grained epistemology, to dissociate two concepts that are often linked in everyday life.
- Loss of status for academics.
- It is possible that designing a nanoconductor factory really is too complicated for an AI only slightly above human level. To understand the risks, we have to take a step back on history, understand what an exponential curve is, and recognize that a superintelligence will arrive later on.

I don't have the time to elaborate, but I find this post compelling.

I think that in the EA community, the bottleneck is the supply of AI safety related jobs/projects; there is already a very strong desire to move into AI safety. The problem is not longtermists who are already working on something else: they should generally continue to do so, because the portfolio argument is compelling. The problem is the bootstrapping problem for people who want to start working in AI safety.

Even if you only value AI safety, having a good cause portfolio in the community is important and makes our community attractive. AI safety is still weird. FTX was originally only vegan, and only later shifted to long-term considerations. That's the trajectory of most people here. Being diverse is at least cool for that reason.

In your view, what proportion of longtermists should work on AI?

I can organize a session with my AI safety novice group to build the Kialo.

We could use Kialo, a web app, to map those points and their counterarguments.
