Raphaël S

67 · Joined Jun 2021

Comments (12)

Why does no one care about AI?

Babbling without any pruning:

- It is very difficult to imagine such an AI
- Currently, computers and AI in video games are stupid
- Collective representations such as Ex Machina or Terminator reinforce the idea that this is only fiction
- Understanding the orthogonality thesis requires a fine-grained epistemology to dissociate two concepts that are often linked in everyday life
- Loss of status for academics
- It is possible that designing something like a nanofactory is really too complicated for an AI only slightly above human level. To understand the risks, we have to step back and look at history, understand what an exponential curve is, and realize that a superintelligence may arrive later on

The Possibility of Microorganism Suffering

I don't have the time to elaborate, but I find this post compelling.

Longtermists Should Work on AI - There is No "AI Neutral" Scenario

I think in the EA community, the bottleneck is the supply of AI safety-related jobs/projects, while there is already a very strong desire to move into AI safety. The problem is not longtermists who are already working on something else; they should generally continue to do so, because the portfolio argument is compelling. The problem is the bootstrapping problem for people who want to start working in AI safety.

Even if you only value AI safety, having a good portfolio as a community is important and makes our community attractive. AI safety is still weird. FTX was originally only vegan, and only later shifted to long-term considerations. That's the trajectory of most people here. Being diverse is at least cool for that reason.

Longtermists Should Work on AI - There is No "AI Neutral" Scenario

In your view, what proportion of longtermists should work on AI?

Why EAs are skeptical about AI Safety

I can organize a session with my AI safety novice group to build the Kialo.

Why EAs are skeptical about AI Safety

We could use Kialo, a web app, to map these points and their counterarguments.

Some unfun lessons I learned as a junior grantmaker

I think the elephant in the room is: "Why are they part-time?"

If making more grants is so important, shouldn't they either hire more people or work full-time? This is something I don't understand about the current status quo.

Hypertension is Extremely Important, Tractable, and Neglected

How did you discover this cause area? Is there a way to automatically browse all diseases with their associated DALYs and see the research effort associated with each disease?
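(For the "automatically browse" part, here is a minimal sketch of what I have in mind, assuming a CSV exported from the IHME GBD Results Tool; the file name and the columns `cause_name` and `val` are hypothetical and depend on the export, and the research-effort side would need a separate data source such as publication counts.)

```python
# Minimal sketch (assumed file and column names): rank diseases by DALY burden
# using a CSV exported from the IHME GBD Results Tool.
import pandas as pd

df = pd.read_csv("gbd_dalys.csv")  # hypothetical export: one row per cause/location/year
top = (
    df.groupby("cause_name")["val"]  # "val" assumed to hold DALY counts
      .sum()
      .sort_values(ascending=False)
      .head(20)
)
print(top)
```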

The Case for Non-Technical AI Safety/Alignment Growth & Funding

I have the impression that one of the reasons for the focus on technical AI safety is that once you succeed in aligning an AI, you expect it to perform a pivotal act, e.g. burning all the GPUs on Earth. To achieve this pivotal act, going through AI governance does not really seem necessary?

But yes, it does seem to be a bit of a stretch.

The Case for Non-Technical AI Safety/Alignment Growth & Funding

Great first post!

Do we have statistics on the number of people and organizations working in technical AI safety versus in AI governance?
