I think I am mostly comparing my impression of the landscape from a few years ago to today's landscape.
I am mostly talking about uni groups (I know less about how status-y city groups are), but there were certainly a few people putting in a lot of hours for no money and not much recognition from the community for just how valuable their work was. I don't want to name specific people I have in mind, but some of them now work at top EA orgs or are doing other interesting things and have status now, I just think it was hard for them to know... (read more)
My previous comment may well 'deserve' to be downvoted, but given that it has been heavily downvoted I would appreciate it a lot if some of the downvoters could explain why they downvoted it.
One thing that I feel this post underemphasises is just how high-impact the top PAs are going to be.
If you believe that impact is extremely heavy-tailed, some PAs (like Holden's) are probably going to have a far greater impact than the vast majority of high-status EAs, even if you are on the more pessimistic end about a PA's value add.
You also might be able to leverage not caring about status. It's plausible to me that some people who are going to start mediocre organisations should instead try to force-multiply the best people, and one reason they don't i... (read more)
I want to be clear that I am endorsing not only the sentiment but the drastic framing. At the end of the day, a few $100k here and there is literally a rounding error on what matters, and I would much rather top researchers spend this money on weird things that might help them slightly than have a few more mediocre researchers working on things that don't really matter.
I certainly wouldn't say this about just any researcher, but if they could work at Constellation/Lightcone they have a 30% chance of hitting my bar. I am much more excited about this... (read more)
Sure, but it's pretty reasonable to think that Kat believes the majority of value will come from helping longtermists, given that that is literally the reason she set up Nonlinear.
Also, EAIF will fund these things.
I like this because it is a low-overhead way for high-impact people to organise retreats, holidays, etc. with aligned people, and this is plausibly very valuable for some people. It will also nudge people to look after themselves and spend time in nice places, which on the current margin is maybe a good thing, idk.
Fwiw I think that LTFF would fund all of the 'example use cases for guests' anyway for someone reasonably high-impact/value-aligned, so I think this is more about nudges than actually creating opportunities that don't already exist.
Not all EAs work on the long-term future
I have been in lots of conversations recently where people expressed their discomfort with the longtermist community's spending (particularly at events).
I think that my general take here is "yeah I can see why you think this but get over it". Playing on the high impact game board when you have $40B in your bank account and only a few years to use it involves acting like you are not limited financially. If top AI safety researchers want sports cars because it will help them relax and therefore be more 0... (read more)
Part of me is a bit sad that community building is now a comfortable and status-y option. The previous generation of community builders had a really high proportion of people who cared deeply about these ideas, were willing to take weird ideas seriously, and often took a substantial financial/career-security hit. I don't think this applies to most of the current generation of community builders to the same degree, and it just seems like much more of a mixed bag people-wise. To be clear, I still think this is good on the margin, I just trust the median new community builder a lot less (by default).
In your list of new hard-to-fake signals of seriousness, I like:
"Doing high-upside things even if there's a good chance they might not work out and seem unconventional."
I think that this is underrated, and as a community we overemphasise actually achieving things in the real world, meaning that if you want to get ahead within EA it often pays to do the moderately good but reasonable thing over the super-high-EV thing, as the weird super-high-EV thing probably won't work. I'm much more excited when I meet young people who keep trying a bunch of things that seem pl... (read more)