I don't understand what you think FLI did wrong. Looks like their due diligence process worked as intended.

I also don't condemn Nick Bostrom's original email and don't see what's racist about it.

If someone calls me, personally, fat, I will see it as an insult in some contexts. But if someone made a true statement about the average BMI of people living in the US, it would be ridiculous for me to take it as an insult.

You call the language used vulgar. I call it efficient. You can call someone "fat", or "overweight", or "having a BMI substantially above average". You can call someone "dumber", or "stupider", or "less intelligent", or "having a lower IQ score". The number of letters increases, but the intended meaning stays the same. Once we make a euphemism for something society deems undesirable, negative connotations grow back over time, so we have to invent more and more elaborate euphemisms. It wouldn't do to be too rude, but we have to stop somewhere before the number of syllables gets out of hand. "Stupid" or "overweight" seem like reasonable compromises to me.

It is true that in some contexts true statements can be used to coordinate violence against a group of people, and it is reasonable to be concerned in those situations. But there are also contexts where people need to communicate clearly and efficiently, without adding a thousand disclaimers to every factual statement they make, because that's necessary to solve problems. You don't take offence if your doctor tells you you're overweight, or if a scientist writes a paper discussing possible causes of rising obesity rates. Writing by rationalists and EAs should be treated the same way. Given the track record of EA in general and Nick Bostrom in particular, and the explicit clarification in his letter that his statement should be understood literally and not as an expression of hatred towards Black people, it is crazy to assume that the statement was intended to coordinate racial violence. The EA community has done a lot to help Black people.

The EA and rationalist communities have always leaned towards decoupling norms in conversation. Following decoupling norms means you must understand statements literally and avoid unwarranted inferences. If someone says X, and you believe that X implies Y, or you believe that the speaker believes that X implies Y, you are not allowed to simply act as if they said Y. You should first clarify.

The epistemic rigor of EA is valuable and unique. It is what distinguishes EA from other do-gooder organizations. I was, for example, part of the Russian opposition movement. It was outright taboo to discuss whether this or that tactic was effective (what if we looked into it and it turned out to be ineffective? It would be so discouraging to everyone!). I rarely saw anyone express less than complete confidence in our victory (surely making different predictions would just mean you were rooting for the other side). I've seen people try to organize a popular opposition movement while reassuring each other that they were NOT trying to gain power (surely only bad people could have goals like that...). You all saw the outcome of that endeavor.

This is why I am extremely disappointed and dismayed to see the EA community violate its own social norms and surrender the unique value it brings to the world, just because some Twitter people are angry or something.