Upvoted because this comment was on -1 karma, I suspect unfairly given that the FTX Future Fund website says "Please post any questions you might have as public comments here" in lieu of a contact form.
Oh yes I know - with my reply I was (confusingly) addressing the unreceptive people more than I was addressing you. I'm glad that you're keen :-)
Nice. And when it comes to links, ~half the time I'll send someone a link to the Wikipedia page on EA or longtermism rather than something written internally.
Maybe you want to select for the kind of people who don't find it too boring! My guess, though, is that the project idea as currently stated is actually a bit too boring even for most of the people you'd be trying to reach. And I guess groups aren't keen to throw money at making it more fun/prestigious in the current climate... I've updated a little away from thinking this is a good idea, but would still be keen to see several groups try it.
Agreed, hence "I don't even think the main aim should be to produce novel work". Imagine something between a Giving Game and producing GiveWell-standard work (much closer to the Giving Game end). Like the Model United Nations idea - it's just practice.
Aye, and EA London did a smaller version of something in this space, focused on equality and justice.
I wonder if the suggestion here to replace some student reading groups with working groups might go some way to demonstrating that EA is a question.
I don't even think the main aim should be to produce novel work (as suggested in that post); I'm just thinking about having students practice using the relevant tools/resources to form their own conclusions. You could mentor individuals through their own minimal-trust investigations. Or run fact-checking groups that check both EA and non-EA content (which hopefully shows that EA content compares pretty well but isn't perfect... and if it doesn't compare pretty well, that's very useful to know!)
| I think the solution here is to create boundaries so you're not optimizing against people.
I prefer 80,000 Hours' 'plan changes' metric to the 'HEA' one for this reason (if I've understood you correctly).
| Separation from friends and loved ones: Happens accidentally due to value changes.
I hope by this you mean something like "People in general tend to feel a bit more distant from friends when they realise they have different values, and EA values are no exception." But if you've actually noticed much more substantial separation tending to happen, I personally think this is something we should push back against, even if it does happen accidentally. Not just for optics' sake ("Mentioning other people and commitments in your life other than EA might go a long way"), but for not feeling socially/professionally/spiritually dependent on one community, for avoiding groupthink, and for not feeling pressure to make sacrifices beyond your 'stretch zone.'
When I was working for EA London in 2018, we also had someone tell us that the free books thing made us look like a cult, comparing it to handing out free Bibles.