alexrjl

I work on the 1-on-1 team at 80,000 Hours, talking to people about their careers; the opinions I've shared here (and will share in the future) are my own.

Comments

Blueprint for billionaires?

There are a few organisations that work with high-net-worth individuals to deploy their money, and my guess is that anyone with this kind of capital would be able to speak to all of them fairly easily.

https://www.longview.org/ might be interesting for you to check out, as well as https://founderspledge.com/.

If it's actually 100B, though, that's bigger than the two biggest EA-adjacent foundations that currently exist, so talking to either of them would be sensible.

https://www.openphilanthropy.org/ 
https://blog.ftx.com/blog/ftx-foundation/ 

Bad Omens in Current Community Building

The comment below is made in a personal capacity and speaks to a specific part of the post, without intending to take a view on the broader picture (though I might make a broader comment later if I have time).

Thanks for writing this. I particularly appreciated this example:

A friend of mine at a different university attended the EA intro fellowship and found it lacking. He tells me that in the first session, foundational arguments were laid out, and he was encouraged to offer criticism. So he did. According to him, the organisers were grateful for the criticism, but didn’t really give him any satisfying replies. They then proceeded to build on the claims about which he remained unconvinced, without ever returning to it or making an effort to find an answer themselves.

I'm pretty worried about this. I got the impression from the rest of your post that you suspect part of the big-picture problem is community builders focusing too much on what will work to get people into AI safety, but I think this particular failure mode is also a huge issue for people with that aim. The sorts of people who will hear high-level/introductory arguments and immediately be able to come up with sensible responses seem like exactly the sorts of people who have high potential to make progress on alignment. I can't imagine many more negative signals for bright, curious people than someone who's meant to be introducing an idea being unable to adequately respond* to an objection they just thought of.

Though, to be fair, 'hang on a sec, let me just check what my script says about that objection' might actually be worse...


*To be clear, 'adequately responding' doesn't necessarily mean 'being so much of an expert that you can come up with a perfect response on the spot'. It's fine not to know stuff, and it's vital to be able to admit when you don't. Signposting to a previous place the question has been discussed, or knowing that it will be covered later (if, e.g., this comes up in a fellowship), both seem useful. It seems important to know enough about common questions, objections, and alternative viewpoints to be able to do this the majority of the time. If it's genuinely something the person running the session has never heard, that's exactly the time to demonstrate good epistemics: being willing to seriously engage, asking follow-up questions, and trying to double-crux.

Help Me Choose A High Impact Career!!!

It's flattering to see that this was in part prompted by my post! 

Without trying to lean too hard into this tweet, I do actually think it might be worth linking to a Google Doc version of this piece with comment access enabled. Being able to comment on specific parts to ask for clarification and/or to respond to others is pretty useful, especially for something that's more than a few paragraphs long.

Brief Presentation and Considerations for an EA Common Application

I found the context of the post kind of hard to understand, and I think the introduction is probably the section most worth editing. In particular, the "there's an opportunity here" framing seemed to clash a bit with "this was almost funded by a major grantmaker" (emphasis mine).

As "almost" means it wasn't funded, it's not super clear how big an update people should make without more context on the funding decision. If OP is making the case for this to happen, I think it might be better to frame the post more clearly as "this is why I think you should found this thing, and I'll connect you to people who are excited to fund it if you think you can".

My bargain with the EA machine

I had similar thoughts, discussed here after I tweeted about this post and somebody replied mentioning this comment.

(Apologies for creating a circular link loop, as my tweet links to this post, which now has a comment linking to my tweet)

Aaron Gertler's Shortform

Sounds right to me! I'm reading Worth the Candle at the moment :)

Aaron Gertler's Shortform

I'd be keen to hear how you're defining the genre, especially when the author isn't obviously a member of the community. I loved Worm, and read it a couple of years ago, at least a year before I was aware rational fiction was a thing; I don't recall thinking "wow, this seems really rationalist" so much as just "this is fun, words go brrrrrrrr".

FTX/CEA - show us your numbers!


"If this is too time-consuming for the current FTX advisers, hire some staff"

Hiring is an extremely labour- and time-intensive process, especially if the position you're hiring for requires great judgement. I think responding to a concern about whether something is a good use of staff time with 'just hire more staff' is pretty poor form, and given the context of the rest of the post, it wouldn't be unreasonable to respond with 'do you want to post a BOTEC comparing the cost of those extra hires you think we should make to the harms you're claiming?'
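For concreteness, a BOTEC of the shape I mean might look something like the sketch below. Every number in it is a made-up placeholder, not an estimate from the post or from me; the point is only that the comparison can be stated in a few lines.

```python
# A minimal back-of-the-envelope calculation (BOTEC) of the shape described
# above. All figures are hypothetical placeholders.

n_hires = 3                    # extra staff proposed (hypothetical)
cost_per_hire = 150_000        # fully loaded annual cost per hire, $ (hypothetical)
counterfactual_multiplier = 2  # value those people might create elsewhere (hypothetical)

annual_cost = n_hires * cost_per_hire * counterfactual_multiplier

claimed_harm = 500_000         # $ value of the harms being claimed (hypothetical)

print(f"Annual cost of extra hires: ${annual_cost:,}")
print(f"Claimed harm avoided:       ${claimed_harm:,}")
print("Hires look worthwhile" if claimed_harm > annual_cost
      else "Hires look hard to justify")
```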

EA Global: London 2022

(Not an organiser, but I live in London.)

I've been recommended https://www.expresstest.co.uk/, and also Biogroup in Shoreditch. Both offer same-day results.

The Vultures Are Circling

Don't people have the option to take it as a lump sum? If so, then presumably anyone willing to game the system to get the money won't be particularly persuaded by a clear instruction to "only spend it on education".
