New writing intern for Nonlinear and module-creator for Clearer Thinking. Previously I was doing a PhD in Classics. I live in London and I like writing things.
I humbly submit my application, having just formally dropped out of a PhD I'd been doing for 6 years (it was one of those long American ones). I de facto dropped out/stopped working on it about a year ago. I feel very good about this decision - continuing would have meant wasting another miserable year or so just to get a credential. I'm doing EA writing stuff now, which seems like a better use of my time.
I broadly agree with 5 and 6.
Re 3, 'There is anticorrelation between the amount of time people have to post on EA Forum and the quality of person' - this makes me wince. A language point: I think talking about how 'good quality' people are overall is unkind, and leads to people feeling bad about themselves for not having such-and-such an attribute. An object-level point: I don't think there is an anticorrelation. Being a busy EA org person probably does make it more likely that someone has valuable takes, but not being a busy-EA-org-person doesn't make it less likely - there aren't that many busy-EA-org-person jobs, and some people aren't a good fit for busy jobs (e.g. because of their health or family commitments) but still have interesting ideas.

Re 7: I'm literally working on a post with someone about how lots of people feel too intimidated to post on the Forum because of its perceived high standards! So although the Forum team are trying to make people feel welcome, it's not true that the Forum is (yet) optimized for this, imo.

There's a kind of general problem whereby any messaging or mechanism designed to dissuade people from posting low-quality things will (a) just not work on some people - some people just have a lot of confidence in their not-very-good opinions, shrug - and (b) dissuade people who would post high-quality things, but who have impostor syndrome, or are perfectionist, or over-self-critical. I think the number of people the mechanism works as intended on - i.e. people who would have posted a low-quality post but are now dissuaded from it - is probably pretty low. Since there are lots of people in EA with impostor syndrome/perfectionism/over-scrupulosity, I'm pretty in favour of the Forum having a 'welcoming' vibe over a We Are Very Serious and Important vibe... because I'd rather have more good takes and more bad takes than get rid of the bad takes and also get rid of good takes from impostors.
Hmm yes, that's interesting! I'd be interested to know how much this happens.
Yeah, I was thinking of that post! Possibly the title of this post shouldn't even include 'should', but instead 'EAs can, if they want to...' But then again, although I anticipate this mainly being done by people who feel some intrinsic motivation, maybe I do think it's something the EA community "should" do more of? I think it wouldn't be a bad idea for EA orgs to do some of this, though as ColdButtonIssues said above, it might be a good idea to avoid doing it for extremely divisive issues.
Yeah, this is a really good point! Something I was kind of aware of while writing this is that I'm a hypocrite - I've never done this. It's probably really hard to do, and probably one reason people don't do it much is just 'it will take me ages and ages, no-one is paying me to do it, and I have a day job/studies/life to deal with'. I would definitely sign up for, like, a 'cost-effectiveness analysis 101 fellowship'.
Thanks Robin! Very good points.
This is a good point, thanks!
Ah, that's interesting - and :( that no-one wanted to pay you to do it. Why do you think the range of cost-effectiveness is greater in global health than in many other areas?
Thanks! That's a really good example.
Yeah, that is a good point. It makes a lot of sense for EA orgs to avoid divisive issues, particularly if they're not among the most pressing anyway. A friend pointed out elsewhere that if producing LICAs were the norm for institutions, you might end up with institutions producing recommendations on both sides of a contentious social issue - e.g., how to effectively improve abortion access, and how to effectively reduce it. This could be bad both for PR reasons (*everyone* would hate us!) and because different sets of EAs would essentially be doing work that cancels out.