John Salter

376 karma · Joined Sep 2022


I hope this answer inspires others as Will's inspired you.


^ If you can upskill in the form of doing a project for some UK org, then this might be worth considering

Because it wouldn't be a very intelligent move from the AGI. It'd be way easier for an AGI to set its own reward function to infinity by manipulating its own circuitry than it would be to warp the universe to its precise specifications. 


We spoke a little at EAG London about how people underestimate the mental health challenges people face in EA, especially among the most senior people. You indicated a willingness to talk about it publicly. If you're still up for it, could you tell us more about your own personal mental health over the past few years and your perceptions of what mental health is like amongst other effective altruists in leadership positions?

It might be worth introducing Hannah at the start of the post. It's likely that many prospective users don't know who she is.

Thank you! I underestimated how quickly the returns to larger sample sizes diminish, or perhaps overestimated how much larger a sample you'd need to correct for multiple comparisons.
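As a rough illustration of both effects (my own sketch, not from the thread, using the standard normal-approximation power formula with illustrative numbers: effect size d = 0.3, 80% power, 20 comparisons):

```python
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided, two-sample z-test."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = z(power)            # quantile for the desired power
    return 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2

# Returns diminish with sqrt(n): halving the standard error
# requires quadrupling the sample size.
base = n_per_group(0.3)                       # alpha = .05
# Bonferroni correction for 20 comparisons: alpha = .05 / 20
corrected = n_per_group(0.3, alpha=0.05 / 20)
print(round(base), round(corrected))
```

Under these assumptions the corrected design needs roughly 332 per group versus roughly 174 uncorrected, i.e. less than double, which is smaller than intuition might suggest for a 20-fold tightening of alpha.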

I followed the logic re: the statistical correction for covariance but still found some of the results surprising, especially the having-to-leave-work part. I would have predicted that depression was more disabling with respect to workplace performance than anxiety. It's thus very surprising to me that it's only anxiety (once the covariance is corrected for) that correlates with having to leave work. I might be underestimating how hopeless people with depression feel, and thus their unwillingness to make life changes, or how high-functioning the average depressive is. Was this a result you found surprising?

It's often easier to get responses from the most senior people in a field.

1. Most people are too intimidated to get in touch with them
2. They're senior for a reason - they tend to be way more productive and opportunity-seeking
3. They have VAs, secretaries, and other people to bring serious requests to their attention.

I work in global mental health and am looking for charities to refer clients to me. The two best-connected people in my field (according to GPT-4) are Dr Vikram Patel and Dr Shekhar Saxena. I sent out ~50 identical cold emails to people I thought could connect me to relevant charities, hospitals, etc. Patel and Saxena were the only two people to reply!

I've also seen this argued by Tim Ferriss and other highly productive people, but it resonated so poorly with my prior beliefs that I didn't update sufficiently. The implications here are huge - it could be way easier to gain access to influential people than the average EA perceives, and influence is power-law distributed!

CEA is widely perceived as grossly ineffective whereas AMF is perceived as highly effective. 

CEA has three major projects I'm aware of, and executes poorly on all of them.

EA Forum

The glorified subreddit you're reading this on costs $2M per year to run, ~$30 per hour of user engagement on non-community posts. By their own admission, they could run it for 10x less than this.

EA Globals

They won't reveal what they've spent on them, but there's good reason to think it's too much. At EAG London, they had three staff standing behind each 1m x 4m table. I asked 5 of them to rate their boredom on a scale of 1-10. The lowest answer I got was 12. They insist on throwing workshops, which the vast majority of users perceive to be a waste of time - it's the one-to-ones that produce the value. If they embraced that fact, they'd be able to choose much cheaper venues.

Community Building and Health

Their own community builders perceive them as incompetent. Most people are too scared to criticize them under their own names because they fear reprisals. You likely underestimate how poorly they are perceived for this reason. EAs don't trust them, which makes them the worst people you could put in charge of community health. 

Other Effects

They're directly or indirectly responsible for most of EA's scandals, from Ni****gate to Castlegate. They were in the best position to prevent FTX's Ponzi scheming and did nothing; they may even have been complicit.

