Thinking, writing, and tweeting from Berkeley, California. Previously, I ran programs at the Institute for Law & AI, worked on the one-on-one advising team at 80,000 Hours in London, and was a patent litigator at Sidley Austin in Chicago.
I appreciate the effort and ambition you're putting into this and endorse you doing the kind of outreach you're most excited about. That said, I doubt this is nearly as valuable as it looks on paper, so groups shouldn't default to replicating it.
So what we have here is a pledge that says that when you enter the workforce and have a steady income, you will donate 1% of your income to charities that you care about.
[emphasis added]
Based on this and the absence of meaningful follow-up, I'd guess these pledges are worth ~5% of 300 high-touch pledges.
It seems like people are going to get an email from GWWC at some point in the future (maybe not even that?) which may or may not successfully remind them of this brief interaction, which may or may not motivate them to click through to the site, which is quite unlikely to convince anyone to donate to a highly effective charity.
Shifting some portion of your efforts to follow-up seems like the right move. Getting one real EA 1% pledger up to 5%, for example, would be worth 80 of these pledges and seems doable.
I avoided opening this post because I was worried it'd have the sort of "we're entitled to Anthropic's money" vibe I've gotten from some other posts, but I'm happy to have been proven wrong. This is a very clear outline of the present problem(s) EA/AIS face in creating projects that are worth funding.
I would have predicted the positive press, and I basically think this would "work" today if these conditions were met:
I agree with you on the overall downsides, though. This sets a bad precedent that will be misused by many and burn a ton of social trust that is ultimately more important.
The most charitable explanation of the tension here is that people just disagree with you about what is most impactful. I appreciate your transparency in considering whether aesthetics and nostalgia for a previous era of EA might be driving your unease.
Ultimately, it is better to debate the merits of specific interventions than general vibes. I think even the Anthropic folks would agree that, e.g., moving to SF is purely instrumental to some more specific theory of change that may or may not have merit.