Non-EA interests include chess and TikTok (@benthamite). We are probably hiring: https://www.centreforeffectivealtruism.org/careers
Feedback always appreciated; feel free to email/DM me or use this link if you prefer to be anonymous.
I actually did that earlier, then realized I should clarify what you were trying to claim. I will copy the results below, but even though they support the view that FTX was not a huge deal, I want to flag that this methodology doesn't seem to actually get at the important thing.
But anyway, my original comment text:
As a convenience sample I searched twitter for "effective altruism". The first reference to FTX doesn't come until tweet 36, which is a link to this. Honestly it seems mostly like a standard anti-utilitarianism complaint; it feels like FTX isn't actually the crux.
In contrast, I see 3 e/acc-type criticisms before that, two "I like EA but this AI stuff is too weird" things (including one retweeted by Yann LeCun??), two "EA is tech-bro/not diverse" complaints, and one thing about Wytham Abbey.
And this (survey discussed/criticized here):
This is a good point – I've (anecdotally) seen one organization "go off the rails" because of a staff member who was behaving unethically but the CEO didn't feel like they had a mandate to just fire them without going through a bunch of formal process.
I guess it's by definition hard to precisely describe when one should deviate from a standard process; perhaps "get feedback from a bunch of experts" is the best advice you could give a CEO in such a situation.
Ah yeah sorry, the claim of the post you criticized was not that FTX isn't mentioned in the press, but rather that those mentions don't seem to actually have impacted sentiment very much.
I thought when you said "FTX is heavily influencing their opinion" you were referring to changes in sentiment, but possibly I misunderstood you – if you just mean "journalists mention it a lot" then I agree.
My experience is that there are a bunch of metrics about startups which correlate better (though not perfectly) with the founders' skill/effort than exit value does:
And most of these metrics are publicly available.
I actually don't know many people in the category of "founded something that was ex-ante plausible, put multiple years into it, but it didn't work out," so I'm mostly speculating. But my somewhat limited experience is that people will usually put on their resume something like "founded and grew my startup to $10M/year ARR with 30 employees, backed by Sequoia," and this is impressive despite them not exiting successfully.[1]
Though obviously ~100% of these founders would happily exchange that line on their resume for a fat check from having sold their company.
Ah yeah, certainly proving yourself in some way will make it easier for you to get funding.
Dumb question: have you considered immigrating to the US? The US has substantially more VC funding available than any other country.
we are seeing really very clear evidence that when people start getting informed, FTX is heavily influencing their opinion.
Thanks! Could you share said evidence? The data sources I cited certainly have limitations, having access to more surveys etc. would be valuable.
Thanks for the helpful comment – I had not seen John's dialogue and I think he is making a valid point.
Fair point that the lack of impact might not be due to attention span but instead things like having competing messages.
In case you missed it: Angelina Li compiled some growth metrics about EA here; they seem to indicate that FTX's collapse did not "strangle" EA (though it probably wasn't good).
Thoughts on the OpenAI Board Decisions
A couple months ago I remarked that Sam Bankman-Fried's trial was scheduled to start in October, and people should prepare for EA to be in the headlines. It turned out that his trial did not actually generate much press for EA, but a month later EA is again making news as a result of recent OpenAI board decisions.
A couple quick points:
A collection of prediction markets about this event can be found here.
Note that the data collected here does not rule out shifts in perception within particular subcommunities (e.g. OpenAI staff), even if the average person's opinion is unchanged.
Oops, that was supposed to link to this sequence; updated now. (That sequence isn't a complete list of everything that I and others at CEA have done, but it's the best I know of.)