Ben Jamin




Free-spending EA might be a big problem for optics and epistemics

I think I am mostly comparing my impression of the landscape a few years ago to today's landscape.

I am mostly talking about uni groups (I know less about how status-y city groups are), but there were certainly a few people putting in a lot of hours for no money and not much recognition from the community for just how valuable their work was. I don't want to name the specific people I have in mind, but some of them now work at top EA orgs or are doing other interesting things and have status now. I just think it was hard for them to know that this is how it would pan out, so I'm pretty confident they are not particularly status motivated.

I'm also pretty confident that most community builders I know wouldn't be doing their job on minimum wage, even if they thought it was the most impactful thing they could do. That's probably fine; I just think they are less 'hardcore' than I would like.

Also, being status motivated is not necessarily a bad thing. I'm confused about this, but it's plausibly good for the movement to have lots of status-motivated people, to the degree that we can make status track the right stuff. I am sure that part of why I am less excited about these people is a vibes thing that isn't tracking impact.

EA Houses: Live or Stay with EAs Around The World

My previous comment may well 'deserve' to be downvoted, but given that it has been heavily downvoted, I would appreciate it a lot if some of the downvoters could explain why they downvoted it.

To PA or not to PA?

One thing that I feel this post underemphasises is just how high impact the top PAs are going to be.

If you believe that impact is extremely heavy-tailed, some PAs (like Holden's) are probably going to have a far greater impact than the vast majority of high-status EAs, even if you are on the more pessimistic end about a PA's value-add.

You also might be able to leverage not caring about status. It's plausible to me that some people who are going to start mediocre organisations should actually try to force-multiply the best people, and one reason they don't is motivated reasoning/over-fitting to what the EA community assigns status to. If you care less about status, you might be uniquely well positioned.

PAing for Liv/Igor and Daniela/Holden seems especially exciting to me. I am kind of averse to trying to control the status structures of EA, but I think PAing for these people 'should' be higher status than running most orgs that OPP funds.

Ben Jamin's Shortform

I want to be clear that I am endorsing not only the sentiment but also the drastic framing. At the end of the day, a few $100k here and there is literally a rounding error on what matters, and I would much rather top researchers spent this money on weird things that might help them slightly than have a few more mediocre researchers working on things that don't really matter.

I certainly wouldn't say this about every researcher; if they could work in Constellation/Lightcone, they have a 30% chance of hitting my bar. I am much more excited about this for the obvious top people at Constellation/Lightcone.

(If someone actually talked about things that make them 0.01% more productive, that would suggest they have lost the plot.)

I don't really like this; presumably if impact is extremely heavy-tailed, we can get a lot of value from finding these activities, and a general aversion to them because they might waste mere money seems very bad. Things like optics are more of a reason to be careful, but idk, maybe we should just make anonymous forum accounts to discuss these things and then actually take our ideas seriously.

EA Houses: Live or Stay with EAs Around The World

Sure, but it's pretty reasonable to think that Kat expects the majority of the value to come from helping longtermists, given that that is literally the reason she set up Nonlinear.

Also, EAIF will fund these things.

EA Houses: Live or Stay with EAs Around The World

I like this because it is a low-overhead way for high-impact people to organise retreats, holidays, etc. with aligned people, and this is plausibly very valuable for some people. It will also nudge people to look after themselves and spend time in nice places, which on the current margin is maybe a good thing, idk.

Fwiw I think that the LTFF would fund all of the 'example use cases for guests' anyway for someone reasonably high impact/value aligned, so I think this is more about nudges than actually creating opportunities that don't already exist.

Ben Jamin's Shortform

Sometimes the high impact game feels weird; get over it.

I have been in lots of conversations recently where people expressed their discomfort with the longtermist community's spending (particularly at events).

I think that my general take here is "yeah, I can see why you think this, but get over it". Playing on the high impact game board when you have $40B in your bank account and only a few years to use it involves acting like you are not limited financially. If top AI safety researchers want sports cars because it will help them relax and therefore be 0.01% more productive (and I trust their judgment and value alignment), they are welcome to my money. Giving them my money is winning, and as far as I am concerned it's a far better use of money than basically anything else I could do.

Yes, this would feel weird, but am I really going to let my own feelings of weirdness stop me from helping billions of people in expectation? That feels much more weird.

Free-spending EA might be a big problem for optics and epistemics

Part of me is a bit sad that community building is now a comfortable and status-y option. The previous generation of community builders had a really high proportion of people who cared deeply about these ideas, were willing to take weird ideas seriously, and often took a substantial financial/career-security hit.

I don't think this applies to most of the current generation of community builders to the same degree, and it just seems like much more of a mixed bag people-wise. To be clear, I still think this is good on the margin; I just trust the median new community builder a lot less (by default).

Free-spending EA might be a big problem for optics and epistemics

In your list of new hard-to-fake signals of seriousness, I like:

Doing high upside things even if there's a good chance they might not work out and seem unconventional.

I think that this is underrated. As a community, we overemphasise actually achieving things in the real world, meaning that if you want to get ahead within EA it often pays to do the medium-value but reasonable thing over the super-high-EV thing, as the weird super-high-EV thing probably won't work.

I'm much more excited when I meet young people who keep trying a bunch of things that seem plausibly very high value and give them lots of information, relative to people who did some ok-ish things that let them build a track record/status. Fwiw I think that some senior EAs do track these high-EV, high-risk things really well, but maybe the general perception of what people ought to do is too close to that of the non-EA world.
