Currently doing local AI safety movement building in Australia and NZ.
I wasn’t claiming that the current organisations haven’t had an impact, but that they haven’t really provided a path to solving this issue. Then again, maybe “solving” is a mistaken frame.
Apologies to Luke if this comment isn't helpful. If that's the case, just let me know. Happy to remove if I'm taking the conversation off-course.
I hope some of the other commenters have answers for you, but tbh, I don't think the limitation here is donations.
This problem seems wildly intractable, but we could be wrong.
Instead, I suspect the limitation is more about gathering a group of intelligent, persistent, and creative EAs to dedicate serious time to rethinking this whole issue from the ground up, in case anything has been missed. I wouldn't put high odds on this turning up much, but it seems worth a shot.
I'm skeptical.
I've read their other comments. The initial comment sounded somewhat plausible, but the others sounded less like what I'd expect from someone in that position.
That makes sense. However, I do think that showing this would be less discouraging for anyone around the bar, who are probably the people it's most important not to discourage (people significantly below the bar would be wasting their time, and people significantly above it are more likely to be confident enough to apply anyway).
I suspect there would be less of a discouragement effect if you listed some grants that were just over the bar?
I’ll have a think about which women I know who I should suggest apply for this program. Do you have any more details about the kinds of candidates you’re looking for?
I wouldn’t be surprised if the rise of AI safety has played a role in this focus.
Let’s suppose your main focus is global charity. Well, you need high-quality analysis, but you don’t need that many analysts. GiveWell is small.
On the other hand, AI safety has a huge demand for talent, and it’s only recently that some of the research directions, like interpretability, have become more scalable.
One difference between our perspectives is that I don't take it for granted that this process will occur unless the conditions are right. And the faster a movement grows, the less likely it is that lessons will be passed on to those coming in. This isn't a dismissal of these people; it's just how group dynamics work, and a reality of more experienced people having less time to engage.
I want to see EA grow fast. But beyond a certain speed (I’m not sure exactly what it is), growth will most likely degrade our culture. All that said, I’m less concerned about this than before. As terrible as the FTX collapse and recent events have been, I wouldn’t be surprised if we no longer have to worry about growing too fast.