I am not a card-carrying member of EA. I am not particularly A, much less E, in that context. But the past few months have been exhausting: watching a community I like be thrown into turmoil repeatedly, while it clearly fumbles basic aspects of how it's seen in the wider world. I like having EA in the world; I think it does a lot of good. And I think you guys are literally throwing it away over the aesthetics of misguided epistemic virtue signalling. But it's late, I've read more than a few articles, and this post is me begging you to please just stop.
The specific push here is of course the Bostrom incident, in which he clearly and highly legibly wrote that black people have lower intelligence than other races. And his apology was, to put it mildly, mealy-mouthed and without much substance. If anything, in the intervening 25 years since the offending email, all he seems to have learnt is to forget the one thing he said he wanted to do: speak plainly.
I'm not here to litigate race science. There's plenty of well-reviewed science in the field demonstrating that there are issues with the measurement of both race and intelligence, to say nothing of how they evolve over time, catch-up speeds, and a truly dizzying array of confounders. I can easily imagine that if you're young and not particularly interested in this space you'd hold a variety of views; what is silly is seeing someone so clearly in a position of authority, with a reputation for careful consideration and truth-seeking, maintain this kind of view.
And not only is this just wrong, it's counterproductive.
If EA wants to work on the most important problems in the world and make progress on them, it would be useful to have the world look upon you with trust. For anything more than turning money into malaria nets, you need people to trust you. And that includes trusting your intentions and your character.
If you believe there are racial differences in intelligence, and your work requires you to tackle the hard problems of resource allocation or longtermist societal evolution, nobody will trust you to make the right tradeoffs. History is filled with optimisation experiments gone horribly wrong when beliefs like these sat at the bottom. The base rate of horrible outcomes is uncomfortably high.
This is human values misalignment. Unless you have overwhelming evidence (or any real evidence), this is just a dumb prior to hold and publicise if you're working on actively changing people's lives. I don't care what you think about the ethics of sentient digital life in the future if you can't figure this out today.
Again, all of which individually is fine. I'm an advocate of people holding crazy opinions should they want to. But when something like a third of the community seems to support him, and the defences require contortions that simultaneously agree with him, dismiss the concerns, and whine about the drama, that's ridiculous. While I appreciate posts like this, which speak about the importance of epistemic integrity, they seem to miss the fact that applauding someone for not lying is great, but not when the belief they're holding is bad. And even if this blows over, it will remain a drag on EA unless it's addressed unequivocally.
Or this type of comment, which uses a lot of words but effectively seems to support the same thought: that no, our job is to differentiate QALYs, and therefore differences are part of life.

But guess what: epistemic integrity on something like this (I believe something pretty reprehensible and am not kowtowing to people telling me so) isn't going to help with shrimp welfare or AI risk prevention. Or even malaria net provision. Do not mistake "sticking with your beliefs" for an overriding good, above believing what's true, acting kindly towards the world, or acting like serious members of a civilisation where we all need to work together. EA writes regularly about burnout from the sheer sense of feeling burdened with a duty to do good - guess what, here's a good chance to act on it.
In fact, if you can't see why sticking with the theory that "race X is inferior in Y" alongside "we are unequivocally in favour of QALY differentiation" constitutes a clear and dangerous problem, I don't know what to say. If you want to be a successful organisation that does good in the world, you have to stop confusing sophomoric philosophical arguments with actual lived concerns in the real world.
You can't sweep this under the rug as the "drama of the day". I'm sorry, but if you want to be anything more than yet another NGO that takes itself a tad too seriously, this is actively harmful.
This isn't a PR problem; it's an actual problem. If one of the most influential philosophers and leaders of your movement is saying things that are just wrong, it hurts your credibility for any other framework you might create. Not to mention the actual flesh-and-blood people who live in the year 2023.
It's one thing to play with esoteric thought experiments about the wellbeing of people in the year 20000. It's quite another to live in the year 2023. Everyone is free to analyse and experiment and explore any question they choose, including this one. But this is not that. This is starting from a professed belief and declaring you're okay doing so because there isn't any contrary evidence. That's not how science works, and that's not how a public-facing organisation should work.
If he'd said, for instance, "Hey, I was an idiot for thinking and saying that. We still have IQ gaps between races, which doesn't make sense. They're closing, but not fast enough. We should work harder on fixing this", that would have been more sensible. Same for the community itself disavowing the explicit racism.
By the way, it's insane that the Forum seems to hide this whole thread as if it were a minor annoyance instead of a death knell. The SBF issue I can understand: you were fooled like everyone else, and it's a black eye for the organisation. But this isn't that. And the level of condemnation that episode brought was a good way to react; this is much more serious.

I should say, I don't have a particular agenda here, and this stream of consciousness is already quite long. I'm a little annoyed, perhaps, that this is flooding the timeline, and that the responses from folks I'd considered thoughtful are tending towards debating weird theoretical corner cases, doing mental jiu-jitsu just to keep holding the faith a little longer. But mostly it's just frustration bubbling out as cope.
I just wish y'all could regain the moral high ground here. There are important causes that could use the energy. It's not even that hard.
I upvoted this post and think it's a good contribution. The EA community as a whole has done damage to itself over the past few days. But I'm worried about what it would mean to support having less epistemic integrity as a community.
This post says both:
and
The first quote says believing X (that there exists a racial IQ gap) is harmful and will result in nobody trusting you. The second says X is, in fact, true.[1]
For my own part, I will trust someone less if they endorse statements they think are false. I would also trust someone less if they seemed weirdly keen on having discussions that kinda seem racist. Unfortunately, it seems we're basically having to decide between these two options.
My preferred solution is to maintain epistemic integrity, while being as clear as possible about the context and taking great care not to cause undue harm. I think "compromising your ability to say true, relevant things in order to be trusted more" is the kind of galaxy-brained PR move that probably doesn't work. You incur the cost of decreased epistemic integrity, and then don't fool anyone anyway. If I can lose someone's trust by saying something true in a relevant context,[2] then keeping their trust was a fabricated option.
I'm left not knowing what this post wants me to do differently. When I'm in a relevant conversation, I'm not going to lie or dissemble about my beliefs, although I will do my best to present them empathetically and in a way that minimizes harm. But if the main thrust here is "focus somewhat less on epistemic integrity," I'm not sure what a good version of that looks like in practice, and I'm quite worried about it being taken as an invitation to be less trustworthy in the interest of appearing more trustworthy.
I've seen other discussions where someone seems to claim both that "the racial IQ gap is shrinking / has no genetic component / is environmentally caused" and that "believing there is a racial IQ gap is, in itself, racist."
I think another point of disagreement might be whether this has been a relevant context to discuss race and IQ. My position is that if you're in a discussion about how to respond to a person saying X, you're by necessity also in a discussion about whether X is true. You can't have the first conversation and completely bracket the second, as the truth or falsity of X is relevant to whether believing X is worthy of criticism.