Trigger warning: racism.
I personally found this letter incredibly difficult to read. Beyond the content of the email, the apology is also terribly written, and reads like Nick, an intellectual leader in EA and longtermism, might still hold these views today to some degree. It also reads like Nick is primarily just trying to do damage control for using a racial slur, or preemptive PR work for some other reason, as opposed to focusing on the harms he may be contributing to, and the folks he is apologizing to. In this context, this also sounds like a dog-whistle:
Are there any genetic contributors to differences between groups in cognitive abilities? It is not my area of expertise, and I don’t have any particular interest in the question. I would leave to others, who have more relevant knowledge, to debate whether or not in addition to environmental factors, epigenetic or genetic factors play any role.
I and other EAs I know are worried about the professional reputational risks of continued association with the EA movement or longtermism. This is not just a PR risk, and despite my view that this reflects terribly on Nick and any comms experts who may have been involved in this, I don't want to imply that the PR angle is what we should be primarily concerned about here: it isn't! But whether one of the leaders of EA has held racist views for decades, and whether he still basically holds them today, is important.
It has real implications for the movement's future, including selection effects on people who may become more uncertain about the views that intellectual leaders of the EA/longtermism movement hold (and by extension, its intellectual foundations), whether EA is a community for "people like them", and whether EA is a movement that is well-equipped to preserve a future for all of humanity. Even if they aren't uncertain, they may be more reluctant to take risks to continue or become more outwardly involved in an increasingly controversial social movement. This may also affect the view of current and prospective donors to EA causes.
These are not concerns held solely by "EA outsiders" or those who are already unsympathetic to EA.
Reactions on Twitter (read on at your own peril!)
(The EA forum seems to default to strong-upvotes on your own posts. I don't know why this is, but I'll probably change mine to a normal upvote if this post gets some engagement.)
Some historical context on this issue. If Bostrom's original post was written around 1996 (as I've seen some people suggest), that was just after the height of the controversy over 'The Bell Curve' book (1994) by Richard Herrnstein & Charles Murray.
In response to the firestorm around that book, the American Psychological Association appointed a blue-ribbon committee of 11 highly respected psychologists and psychometricians to evaluate the Bell Curve's empirical claims. They published a report in 1996 on their findings, which you can read here, and summarized here. The APA committee affirmed most of the Bell Curve's key claims, and concluded that there were well-established group differences in average general intelligence, but that the reasons for the differences were not yet clear.
More recently, Charles Murray has reviewed the last 30 years of psychometric and genetic evidence in his book Human Diversity (2020), and in his shorter, less technical book Facing Reality (2021).
This is the most controversial topic in all of the behavioral sciences. EAs might be prudent to treat this whole controversy as an information hazard, in which learning about the scientific findings can be s…
A short note as a moderator (echoing a commenter): People (understandably) have strong feelings about discussions that focus on race, and many of us found the linked content difficult to read. This means that it's both harder to keep to Forum norms when responding to this, and (I think) especially important.
Please keep this in mind if you decide to engage in this discussion, and try to remember that most people on the Forum are here for collaborative discussions about doing good.
If you have any specific concerns, you can also always reach out to the moderation team at email@example.com.
The original email (written 26 years ago) was horrible, but Bostrom's apology seems reasonable.
If you look at the most horrible thing that each person has done in their entire life, it's likely that almost everyone (at that age) has done things that are at least as horrible as writing that email.
The OP reads as if it was optimized to cause as much drama as possible, rather than for pro-social goals.
First, I don't think I've ever expressed a view as horrible as that. I don't want to make myself out to be a saint or something, and maybe I've done more harmful if less obviously spicy things (e.g., I ate a bunch of animals in my youth). But I'm reflexively sceptical of defences which are like "oh, surely everyone makes mistakes". I think it's ok to have high standards, and lots of people get through their youth managing not to make egregiously racist statements.
But this isn't my crux; my crux is that the apology doesn't ring sincere to me. He says lots of words about repudiating the email, but like...I don't understand why he said it at the time, and what changed in his opinions to make him apologize for it at the time, and reject it now. I agree with some other commenters that it's not clear that he thinks this is object-level bad vs PR-bad.
Doesn't the first sentence of his old email make this part fairly clear? It sounds like he's talking about the classic edgelord thing of enjoying the tension between intuitive repugnance and (what he took to be) logical truth on a strictly literal reading, when divorced of all subtext (which is presumably not what any reasonable person would ordinarily take the claims in question to communicate). Perhaps similar to how many philosophers find logical paradoxes invigorating. (Cf. Scott Alexander's classic post on related issues.)
That's not to defend it, and I agree his apology isn't sufficiently clear about why his particular example was so egregiously poorly chosen. But it does strike me as most likely stemming from neuro-atypicality rather than racist intent, for whatever that's worth. (Many understandably care more about racist effects than racist intent, but I mention the latter here since you seem to be asking about Bostrom's motivations, and that does seem relevant to assessments of blameworthiness.)
(Not sure whether this clarification is needed, but just in case...)
When I wrote "If you look at the most horrible thing that every person has done in their entire life[…]" I meant: "If for every person you look at the most horrible thing that that person has done[…]" (I.e. it can be a completely different thing for each person.) I've edited my comment to make that clear.
First of all, I think it's clearly much worse that he mentioned the n-word directly rather than replacing it with the string "n-word", especially in the 90s social context.
Secondly, I think that the focus on labeling people who lean towards there being a genetic difference in population means as bad is mistaken given that the threat is actually people who try to leverage this claimed difference politically or attempt to inject their belief in this difference into as many conversations as possible. I think once we have in mind precisely which subset of people we should be worried about, then my position on what is worse ends up being quite natural.
I disagree with this. I think the apology, presumably carefully considered, is as bad as, if not worse than, the original email (at least, the apology contributes to a larger proportion of my negative update than the email written 26 years ago). I also think the apology was poorly written, and I would be surprised if it was signed off by a PR or comms expert.
Happy to hear constructive feedback, and am curious about what gives you this impression. I think there are clear ways this post could have been worded and framed in a way that would cause more drama. I'm sharing in part because I care about the EA community, and I think this may be useful information for the EA community to know about and engage with, and in part because the individual who shared this post with me found it disconcerting that no one had spoken out against it yet. I resonated with this, and didn't want other community members to feel similarly.
The people I discussed this with prior to posting agreed that we didn't want this post to prompt object-level IQ debates, which are usually unproductive, but also that we didn't want to highlight the PR angle too much, since, as other commenters alluded to, this isn't and shouldn't be seen as the primary concern.
To add one more person's impression: I agree with ofer that the apology was "reasonable," I disagree with him that your post "reads as if it was optimized to cause as much drama as possible, rather than for pro-social goals," and I agree with Amber Dawn that the original email is somewhat worse than something I'd have expected most people to have in their past. (That doesn't necessarily mean it deserves any punishment decades later and with the apology – non-neurotypical people can definitely make a lot of progress between, say, their early twenties and later in life, in understanding how their words affect others and how edginess isn't the same as being sophisticated.)
I think this is one of these "struggles of norms" where you can't have more than one sacred principle, and ofer's and my position is something like "it should be okay to say 'I don't know what's true' on a topic where the truth seems unclear ((but not, e.g., something like Holocaust denial))." Because a community that doesn't prioritize truth-seeking will run into massive troubles, so even if there's a sense in which kindness is ultimately more important than truth-seeking (I definitely think so!), it just doesn't make sense…
It changes the emphasis a bit from "written evidence" (and "expressed worldviews") to "anything whatsoever."
E.g., if classrooms in 2005 had CCTV, you could find a video of my 14-year-old self deliberately mispronouncing someone else's name to make it sound dumb and making a comment about them having "girly" hair after someone else had already started making fun of him. I think that video would be similarly hard to watch as the original Bostrom email is hard to read.
edit: At least on some dimensions of "hard to watch"? I understand the view that Bostrom's comments were much worse, but I think there's something especially jarring about expressed lack of empathy when the person who's being hurt is right in front of you, as opposed to saying dumb stuff in a small/closed setting to be intellectually edgy.
Unless people here have a far better story than "Eugenics is horrible because eugenics!" behind their usage of the word 'horrible' with respect to Bostrom's words I suggest they stop using it. This is the EA forum after all and we ought to do better here than circular logic.
Skimming the comments so far, I'd appreciate if people would keep the "Be kind" part of the forum norm trifecta more in front of their minds:
This post touches on topics that are very emotionally difficult and controversial for many. I tentatively recommend that if you're very angry or upset, it may be hard enough to remain civil and charitable that you might want to hold off on immediately engaging a lot in the comments here. Maybe first:
(I downvoted a few comments because I think they failed to be charitable and/or civil. But as I think this specific topic is very difficult to deal with, I hope the downvotes will not generally discourage people from engaging in this forum as I believe they generally come with good intentions.)
I agree with the vibe of this but disagree with the specific advice, maybe? I think if we discourage people from commenting when they're feeling strong emotions, we miss out on valuable information. You suggest waiting for a few hours, but first, I think emotional first reactions are information (it's information if something makes people angry! info about the values of the EA movement, or about how bad Bostrom's comments are, for example); and secondly, some people might just not come back, or they might never cease to be emotional about the issue. Communicating in the standard detached Forum way might just feel dishonest to them. This means that the consensus becomes skewed to those who are less emotionally-activated by the issue. So in situations like this, people who are more forgiving or who think it's not that bad or who just have more muted emotions or more reserved communication styles will dominate the discussion.
Amber -- I strongly disagree with this take. Almost everything that EAs try to talk about dispassionately and objectively could be talked about reactively and emotionally with great intensity, if we didn't show enormous self-control and self-awareness -- and that would ruin the entire ethos and culture of EA Forum.
Everyone who really, deeply values animal welfare could react emotionally to every discussion of that topic. Everyone who really, deeply dreads global thermonuclear war could react emotionally to every discussion of that topic. And so on. The whole point of the EA movement is to try to grapple with extremely large-scale, high-stakes problems that most people can't think about rationally or empirically, using reason and evidence as best we can.
There's nothing intrinsically or uniquely emotional about race differences issues, apart from the current cultural context that racism in Western liberal academic cultures is uniquely stigmatized compared to every other moral failing in modern life.
Thanks for the pushback (I also appreciated Amber's response). I agree that there's a risk of taking the direction of my comment too far, and I agree that anger is a fully valid emotion and it's fully valid and informative/useful to communicate it here.
What I do still believe is that anger makes productive discourse more difficult, and I think that many comments here are cases of that happening. When I get angry, I'm less patient, I feel more like I'm in a fight and that I want to win an argument as opposed to understand the situation better and contribute my perspective to a shared process of understanding and decision-making. In case you're familiar, in EA terms I think that anger moves me away from a "scout mindset" and towards a "soldier mindset".
I'm currently not convinced that in this discussion, and on the EA forum in general, sharing emotions is discouraged to a degree that is worrying and that discourages affected groups and individuals, and I'd be sad to learn that this impression is wrong. What I see as discouraged is incivility, uncharitability and snark, and I suspect it only seems like there are more downvotes for emotional comments because of the "anger -> impatience" mechanism.
As you’re reading this old email and the ‘apology’, instead of diving into debates about race, IQ, consequentialism, reputation and PR, please think about how black EAs and longtermists are feeling reading all of this.
I don't really like this thing where you speak on behalf of black EAs.
I think you should let black EAs speak for themselves or not comment on it.
In my experience, there seems to be distortionary epistemic effects when someone speaks on behalf of a minority group. Often, the person so speaking assigns them harms, injustices or offenses that the relevant members of those groups may not actually endorse.
When it's done on my behalf, I find it pretty patronising, and it's annoying/icky?
I don't want to speak for black EAs but it's not clear to me that the "hurt" you mention is actually real.
A few points that I think are important to make anytime conversation gets anywhere near the object level on this topic:
With all that being said, is there any place at all for the object level question in our discourse? I don’t know.
I haven't had a chance to read the whole blog post, but note for general information David Thorstad's blog post on this, which alleges:
Unfortunately, Bostrom’s email was not an isolated incident. The Extropians were widely involved in a number of explicitly racist, sexist and otherwise lamentable incidents that Bostrom cannot possibly have failed to be aware of at the time, and many of their former members occupy high positions within the effective altruism movement today.
(apologies for formatting, on mobile)
Want to note on this thread that CEA has published a statement on this: "Effective altruism is based on the core belief that all people count equally. We unequivocally condemn Nick Bostrom’s recklessly flawed and reprehensible words. We reject this unacceptable racist language, and the callous discussion of ideas that can and have harmed Black people. It is fundamentally inconsistent with our mission of building an inclusive and welcoming community."
Ugh. Painful to read.
Here as everywhere the key, to keep sane, is to hold a number of thoughts simultaneously:
1) The original email is horribly racist and indefensible. Just awful.
2) The author seems to have evolved since and should be given the benefit of the doubt.
3) Sincerely apologizing for past misdeeds is good in and of itself.
4) Descending on anyone who attempts to apologize without crediting them for the apology only emboldens those who will see the outcome and conclude that it is better not to apologize at all.
5) Thinkers are not gods or saints and should not be treated as such.
6) There is no "original sin". It is good for you to hold ideas that are good for the world. If someone else who holds ideas that are good for the world also holds ideas that are bad for the world, you are not responsible and should not feel guilty about it.
7) It is unproductive to think of "intellectual leaders", their status, and their reputation in general. Intellectual leaders gain their status from pushing ideas that you find convincing. It is the ideas that you should focus on. Your duty is to keep and promote the ones that work, and get rid of the ones that don't.
8) Because people hold a mi…
EA has become big enough to weather such a storm, especially since pushing back against the ever-increasing number of calls for struggle sessions should become a central EA cause area. People need to be able to speak their minds and be judged by their deeds and impact, not by some demonstrably uncharitable reading of their thoughts. The latter seems motivated mainly by an attempt to stir up a denunciation rally, likely because somebody didn't like the intellectual content Bostrom is known for.
It is also important to keep adult public discourse policing sane en…
Bostrom is claiming the right to not engage with a subject, and this right is a necessity to have an inclusive EA-movement.
I downvoted this post because it is propagating a norm that I expect to be damaging.
…
- Totally agree that this "has real implications for the movement's future, including selection effects on people who may become more uncertain about the views that intellectual leaders of the EA/longtermism movement hold (and by extension, its intellectual foundations), whether EA is a community for 'people like them', and whether EA is a movement that is well-equipped to preserve a future for all of humanity." I'm worried that this statement will deter people from collaborating with or joining AI labs, for example.
- I think this could have been
I didn't downvote, but I think it's bad to focus on the optical concerns over the object-level concerns about the statement. "This statement is bad because it might reduce engagement with EA" may be true, but it's probably not the first-order badness, just like the first-order badness of FTX was fraud and not the harms to EA.
While I agree that these kinds of "bad EA optics" posts are generally unproductive and it makes sense for them to get downvoted, I'm surprised that this specific one isn't getting more upvoted? Unlike most links to hit pieces and criticisms of EA, this post actually contains new information that has changed my perception of EA and EA leadership.
If the post had just been making the forum readers aware of the controversy and added some commentary along the lines "this was really hard to read and was really disappointing," then I would've upvoted it (I see it the same way.)
However, the OP also highlights a particular paragraph in the apology (about the cause of differences in group averages) and implies that Bostrom's uncertainty about it and his statement "and I don’t have any particular interest in the question" means that he holds morally repugnant views or at least doesn't sufficiently condemn them…
I explained here why I don't agree with this. To quote from that comment:
Even in the original email, Bostrom makes clear that differences in intelligence do not alter the moral value or human dignity of each person.
For him, as for many, the issues of intelligence and moral worth are distinct; he never claims that black people are worth less. You are ascribing to him your own notion that IQ = moral worth, and then blaming him for not responding to it.
Thanks for the reply. That makes sense! I feel like Bostrom said a bit more than you describe here to make it clear that he doesn’t hold the view that white people are superior. So, to me, while “I like this comment” seemed like an extremely unfortunate phrasing on his part, the context at least made clear that he liked how the comment is "bold and edgy" rather than liking something about alleged differences between white and black people.
That said, you're right that it's important to make these things really clear in an apology, and he could have said more on the topic. Other people have also had negative reactions to the apology (e.g., Habiba here), so maybe I'm in a minority. I read the apology and thought it wasn't bad. I agree it could've been better (e.g., he could have written something like the paragraphs I wrote on how we should be very clear that group averages don't have moral significance).
(I'm sometimes not sure whether it's good to make apologies really long. If I ever had to apologize for something pretty bad, I'd be tempted to write a very long statement – but that may come across as self-absorbed and overly defensive. It just seems hard to get this right and I feel like Bostrom's apology at least hit a few aspects of what I'd expect an acceptable apology to contain.)
My sense is that many people thought the apology was reasonable. Your comment and Ofer's comment, both of which defend the apology as reasonable, have both 65 Karma which makes them among the most upvoted comments in this whole discussion (both statements are made Jan 13, 11:23 CET). I also think Bostrom's apology is reasonable (needless to say, I share your and Ofer's negative reaction to the original email.)
I think it's much riskier, reputation-wise, to state that one had a positive reaction to the apology than to state that one had a negative one, so we will see more of the latter. I think votes are in this case a more accurate reflection of people's views.
Bostrom appears not to realise that "I hate these bloody [racial slur]s!" is not the only form racism can take. Believing that black people are more stupid than white people is also racist. I am not convinced that he no longer believes this. I am deeply disappointed by his apology (more than I am by his original email).
Had Bostrom left out the paragraph quoted by the OP, the apology would read very differently. In the prior paragraph he wrote:
This, if left to its own, would have stood as a strong statement on equity.
By adding the following paragraph on genetics, Bostrom implies the opposite of his claimed indifference to the genetics of race…
It is surprising to see a community of rationalists so opposed to Bostrom’s original point, which is that object level discussions of potentially uncomfortable truths are best avoided in general company, but useful among rationalists.
I am especially disappointed in Bostrom himself, who seems to hedge on a claim he still clearly believes to be empirically valid.
Genetic differences in intelligence are certainly an impolite discussion, but Bostrom's original framing coincides with my view: intelligence is not tied to moral worth or human dignity, but, for a…