
Chris Leong

Organiser @ AI Safety Australia and NZ
Sydney NSW, Australia

Bio


Currently doing local AI safety Movement Building in Australia and NZ.

Sequence: Wise AI Wednesdays


Very excited to read this post. I strongly agree with both the concrete direction and with the importance of making EA more intellectually vibrant.

Then again, I'm rather biased since I made a similar argument a few years back.

Main differences:

  • I suggested that it might make sense for virtual programs to create a new course rather than just changing the intro fellowship content. My current intuition is that splitting the intro fellowship would likely be the best option for now. Some people will get really annoyed if the course focuses too much on AI, whilst others will get annoyed if it focuses too much on questions that would likely become redundant in a world where we expect capability advances to continue. My intuition is that things aren't yet at the stage where a complete AGI pivot would make sense for the intro fellowship, which is why I'm suggesting a split. Both courses should probably still give participants a taste of the other.
  • I put more emphasis on the possibility that AI might be useful for addressing global poverty and that it intersects with animal rights, whilst perhaps Will might see this as too incrementalist (?).
  • Whilst I also suggested that putting more emphasis on the implications of advanced AI might make EA less intellectually stagnant, I also noted that perhaps it'd be better for EA to adopt a yearly theme and simply make the rise of AI the first. I still like the yearly theme idea, but the odds and legibility of AI being really important have increased enough that I now feel a lot more confident in identifying AI as an area that deserves more than just a yearly theme.

I also agree with the "fuck PR" stance (my words, not Will's). Especially insofar as the AIS movement has greater pressure to focus on PR, since it's further towards the pointy end, I think it's important for the EA movement to use its freedom to provide a counter-balance to this.

I would like to suggest that folk not downvote this post below zero. I'm generally in favour of allowing people to defend themselves, unless their response is clearly in bad faith. I'm sure many folk strongly disagree with the OP's desired social norms, but this is different from bad faith.

Additionally, I suspect most of us have very little insight into how community health operates, and this post provides some much-needed visibility. Regardless of whether you think their response was just right, too harsh or too lenient, this post opens up a rare opportunity for the community to weigh in.

I suspect people are downvoting this post either because they think the author is a bad person or because they don't want the author at EA events. I would suggest that neither of these is a good reason to downvote this specific post into the negative.

Sure, but these orgs found their own niche.

HIP and Successif focus more on mid-career professionals.

Probably Good focuses on a broader set of cause areas, and has taken on some of 80k's old responsibilities since 80k started focusing more on transformative AI.

Oh, I think AI safety is very important; short-term AI safety too though not quite 2027 😂.

Knock-off MATS could produce a good amount of value, I just want the EA hotel to be even more ambitious.

50% disagree

Should our EA residential program prioritize structured programming or open-ended residencies?


There's more information value in exploring structured programming.

That said, I'd be wary of duplicating existing programs, e.g. if the AI Safety Fellowship became a knock-off MATS.

What the School of Moral Ambition has achieved is impressive, but it's unclear whether EA should aim for mainstream appeal insofar as SoMA could potentially fill that niche.

"~70% male and ~75% white" — I increasingly feel that the way to be cool is to not be so self-conscious about this kind of stuff. Would it be great to have more women on our team? Of course! And for EA to be more global? Again, that'd be great! But talking about your demographics like it's a failure will never be cool. Instead, EA should just back itself. Are our demographics ideal? No. But if circumstances are such that we need to get the job done with these demographics, then we'll get the job done with these demographics. And honestly, the less you need people, the more likely they are to feel drawn to you, at least in my experience.

"Please, for God’s sake, hire non-EA creative talent" — I suspect this is very circumstantial. There are circumstances where you'll be able to delegate to non-EA creative talent and it'll work fine, but there will be other circumstances where you try this and you find that they just keep distorting the message. It's harder than you might think.

I agree re: 4 though. The expectations re: caveats depend heavily on the context of the post. 

An analogy: let's suppose you're trying to stop a tank. You can't just place a line of 6 kids in front of it and call it "defense in depth".

Also, it would be somewhat weird to call it "defense in depth" if most of the protection came from a few layers.

Feel free to reply to this comment with any suggestions about other graphs that I should consider including.

Create nice zones for spontaneous conversations (not sure how to do this well)


I've tried pushing for this without much success unfortunately.

It really is a lot more effort to have spontaneous conversations when almost every pair of people is in a one-on-one and almost everyone on their own is waiting for one.

I've seen attempts to declare a space an area that's not for one-on-ones, but people have one-on-ones there anyway. Then again, organisers normally put up one or two small signs.

Honestly, the only way to stop people having one-on-ones in the area for spontaneous conversation might be to have an absurd number of big and obvious signs.

For most fellowships you're applying to work with a mentor rather than to pursue your own project (ERA is an exception). And on the most common fellowships, which last a few months, it's pretty much go, go, go, with little time to explore.
