The Forum is getting a bit swamped with discussions about Bostrom's email and apology. We’re making this thread where you can discuss the topic.
All other posts on this topic will be marked as “Personal Blog” — people who opt in or have opted into seeing “Personal Blog” posts will see them on the Frontpage, but others won’t; they’ll see them only in Recent Discussion or in All Posts. (If you want to change your "Personal Blog" setting, you can do that by following the instructions here.)
(Please also feel free to give us feedback on this thread and approach. This is the first time we’ve tried this in response to events that dominate Forum discussion. You can give feedback by commenting on the thread, or by reaching out to forum@effectivealtruism.org.)
Please also note that we have seen an influx of people creating accounts over the past week to cast votes and leave comments, and we are aware that people who feel strongly about human biodiversity sometimes vote-brigade on sites where the topic is being discussed. Please be aware that voting and discussion on some topics may not be representative of the normal EA Forum user base.
If you choose to participate in this discussion, please remember Forum norms. Chiefly,
- Be kind.
- Stay civil, at the minimum. Don’t sneer or be snarky. In general, assume good faith. We may delete unnecessary rudeness and issue warnings or bans for it.
- Substantive disagreements are fine and expected. Disagreements help us find the truth and are part of healthy communication.
Please try to remember that most people on the Forum are here for collaborative discussions about doing good.
With apologies, I would like to share some rather lengthy comments on the present controversy. My sense is that they likely express a fairly conventional reaction. However, I have not yet seen any commentary that entirely captures this perspective. Before I begin, I perhaps also ought to apologise for my decision to write anonymously. While none of my comments here are terribly exciting, I would like to think, I hope others can still empathise with my aversion to becoming a minor character in a controversy of this variety.
Q: Was the message in question needlessly offensive and deserving of an apology?
Yes, it certainly was. By describing the message as "needlessly offensive," what I mean to say is that, even if Prof. Bostrom was committed to making the same central point that is made in the message, there was simply no need for the point to be made in such an insensitive manner. To put forward an analogy, it would be needlessly offensive to make a point about free speech by placing a swastika on one’s shirt and wearing it around town. This would be a highly insensitive decision, even if the person wearing the swastika did not hold or intend to express any of the views associated wit... (read more)
I came here to write something kind of sloppy, but this is a much more measured and clear thing than I could have written, and I agree with basically all of it (though I think I might have more support for point A than you do, depending on some nuance). I'm also pretty disappointed with CEA's response and have some desire to go around semi-emotionally saying something like, "this organization clearly does not have truth/integrity as its primary value; you cannot trust it". I'm pretty sad about this; I'm not personally an EA, but I have many friends in the community and have supported and defended the movement from the sidelines for a long time. While I intend to keep supporting my friends, I feel much less inclined to support the organized movement now.
Let us take a moment of sympathy for the folks at CEA (who are, after all, our allies in the fight to make the world better). Scant weeks ago they were facing harsh criticism for failing to quickly make the conventional statement about the FTX scandal. Now they're facing criticism for doing exactly that. I'm glad I'm not comms director at CEA, for sure.
While I do sympathise with them having to handle yet another scandal which most of them had no involvement with, this comment seems to both oversimplify the differences and misrepresent what people were actually asking for post-FTX:
As far as I know, they've still offered no such public introspection.
Agree with the original comment and Aella. I would add that, should the university or others decide to take action, this matter would be important enough to stand one's ground in favor of Bostrom, including non-anonymously. We cannot "choose" our views, as some comments have asked us to do; that seems very PC. Also, we should not be held accountable for what we wrote 25+ years ago unless we repeat it.
However, "standing one's ground" is precisely the opposite of what is needed: we need calm, well-intended, and measured discussion, and I appreciated David Thorstad's blog post criticizing Bostrom in depth. It is understandable that some (here, on Twitter, or elsewhere) are angry, frustrated, or demand changes. Public statements like the one from CEA, however, are likely not helpful (that they don't endorse the original email could be presumed even without their message). Nor, to be fair, was Bostrom's rather sloppy apology.
What is not obvious is the next step. I believe Bostrom when he says he is not interested in continuing this discussion, and I see no value in forcing him to. Maybe a workshop or red-team white paper taking a close, balanced look at whether EA and the longtermist movement suffer from racism as alleged, and if so, what can be done about it?
While I think it can make sense to model whole organisations as having traits like 'truth-seeking' or 'having integrity' or 'transparent', particularly when they are small and homogenous, it's always worth remembering that organisations are made up of people, and those people can vary a lot along all those traits. For example, CEA's character could change rapidly after hiring a lot, or if they lose one exceptionally conscientious person, etc.
Brief note on why EA should be careful to remain inclusive & welcoming to neurodiverse people:
As somebody with Asperger's, I'm getting worried that in this recent 'PR crisis', EA is sending some pretty strong signals of intolerance to those of us with various kinds of neurodiversity that can make it hard for us to be 'socially sensitive', to 'read the room', and to 'avoid giving offense'. (I'm not saying that any particular people involved in recent EA controversies are Aspy; just that I've seen a general tendency for EAs to be a little Aspier than other people, which is why I like them and feel at home with them.)
There's an ongoing 'trait war' that's easy to confuse with the Culture War. It's not really about right versus left, or reactionary versus woke. It's more about psychological traits: 'shape rotators' versus 'wordcels', 'Aspies' versus 'normies', systematizers versus empathizers, high decouplers versus low decouplers.
EA has traditionally been an oasis for Aspy systematizers with a high degree of rational compassion, decoupling skills, and quantitative reasoning. One downside of being Aspy is that we occasionally, or even often, say things that normies consid... (read more)
FWIW despite having pretty diametrically opposed views on a lot of these things, I agree that there is something to the issue/divide you reference. It seems correlated with the "normie-EA vs. rationalist-EA" divide I mentioned elsewhere on this page, and I think there are potential tradeoffs from naively responding to the (IMO) real issues at stake on the other side of the ledger. How to non-naively navigate all this seems non-obvious.
Miles. I agree, more or less. It is very tricky to navigate, because EA does include people with different personality traits and cognitive styles. These are almost like different Bayesian priors with respect to 'social cause areas', e.g. the relative importance of being nice vs. being empirically accurate. Will ruminate more about all this....
I wholeheartedly agree that EA must remain welcoming to neurodiverse people. Part of how we do that is being graceful and forgiving toward people who inadvertently violate social norms in pursuit of EA goals.
But I worry this specific comment overstates its case by (1) leaving out both the "inadvertent" part and the "in pursuit of EA goals" part, which implies that we ought to be fine with gratuitous norm violation, and (2) incorporating political bias. You say:
I don't want to speak for anyone with autism. However, as best I can tell, this is not at all a universal view. I know multiple people who thrive in lefty spaces despite seeming (to me at least) like high decouplers. So it seems more plausible to me that this isn't narrowly true about high decouplers in "woke" spaces; it's broadly t... (read more)
AnonymousQualy - You make some valid & thoughtful points. Let me ruminate further about them....
For the moment, I would just question the generality of your claim that 'the really big taboos - like race and intelligence - are usually obvious'. That might be true within a given culture, generation, social class, and ideological context. However, it is often not true across cultures. (For example, when I taught courses for a mainland Chinese university the last couple of years, the students there really wanted to talk about the evolution of race differences and intelligence -- much more than I was comfortable doing.)
If EA aspires to be a global movement, we need to consider the fact that some of our strongest current Anglo-American ideological taboos are not shared by other cultures. And if we impose our taboos on other cultures, we're not really being culturally inclusive.
I addressed these issues in another 2018 article, 'The cultural diversity case for free speech' (paywalled on Quillette here; free pdf here).
Thanks a lot for raising this, Geoffrey. A while back I mentioned some personal feelings and possible risks related to the current Western political climate, from one non-Westerner's perspective. You've articulated my intuitions very nicely here and in that article.
From a strategic perspective, it seems to me that the longer AGI takes to develop, the more likely it is that decision-making power will be shared globally. EAs should consider that they might end up in that world, and that it might not be a good idea to create and enforce easily-violated, non-negotiable demands on issues that we're not prioritizing (e.g. it would be quite bad if a Western EA ended up repeatedly reprimanding a potential Chinese collaborator simply because the latter speaks bluntly by the former's standards). To be clear, China has some of this as well (mostly relating to its geopolitical history), and I think feeling less strongly about those issues could be beneficial.
Regarding your point (2), I can see both sides of this.
I agree that some cultural norms are generally better, by most metrics, in terms of human flourishing, social cohesion, progress, prosperity, freedom, resilience, longevity, etc. -- although there are almost always tradeoffs, exceptions, and complications that warrant considerable epistemic humility and ethical uncertainty.
My heuristic is that members of Anglo-American cultures should usually err on the side of listening more than preaching when interacting with people from other cultures who probably know much more about our culture (e.g. through US/UK movies, TV, social media, global news) than we know about theirs.
For what it's worth, I am autistic (and a white man, as it happens), and I do not find that free, uncensored discussion of race/IQ stuff specifically makes me feel more welcome and comfortable. Rather, it makes me feel sick and anxious, and worried that I am associating with bad people if I participate in the discussion. (I actually agree with Geoffrey Miller that there is probably some connection between autism and what he's calling "decoupling", though. And that this probably makes EA more welcoming for autistics. But ultimately, the point of EA is meant to be to do good, not to be a social club for autistic people.)
EDIT: I also strongly second AnonymousQualy's point that Bostrom knew he was saying taboo stuff and so it cannot be dismissed as 'autistic people don't know when they are offending others'.
I think censorship would be a bad choice here, because the EA forum hasn't discussed these concepts previously (in any routine way, I'm sure there is a screed or two that could be dug up from a mound of downvotes) and is unlikely to in the future.
I would agree that race/IQ debates on the EA forum are unlikely to produce anything of value. But it's my experience that if you have free discussion rights and one banned topic, that causes more issues than just letting people say their piece and move on.
I'd also agree that EA isn't meant to be a social club for autists - but from a cynical perspective, the blithely curious and alien-brained are also a strategic resource and snubbing them should be avoided when possible.
If people are still sharing takes on race/IQ two weeks from now, I think that would be a measurable enough detraction from the goal of the forum to support the admins telling them to take it elsewhere. But I would be surprised if it were an issue.
Uncontroversial take: EA wouldn't exist without the blithely curious and alien-brained.
More controversially: I've been increasingly feeling like I'm on a forum where people think the autistic/decoupler/rationalist cluster did their part and now should just... go away. Like, 'thanks for pointing us at the moral horrors and the world-ending catastrophe, I'll bear them in mind, now please stop annoying me.'
But it is not obvious to me that the alien-brained have noticed everything useful that they are going to notice, and done all the work that they will do, such that it is safe to discard them.
Let me say this: autism runs in my family, including two of my first cousins. I think that neurodivergence is not only nothing to be ashamed of, and not an "illness" to be "cured", but in fact a profound gift, and one which allows neurodivergent individuals to see what many of us do not. (Another example: listen to Vikingur Olafsson play the piano! Nobody else hears Mozart like that.)
Neurodivergent individuals and high decouplers should not be chased out of effective altruism or any other movement. Doing this would not only be intrinsically wrong, but would also deprive the movement of profoundly important insights, and would deprive the neurodivergent of one of the few places where they can genuinely belong.
It is very important to recognize that neurodivergent individuals, among others, sometimes have a harder time recognizing violations of social norms, and to exercise some degree of patience and compassion in responding to norm violations.
It is also important for everyone, no matter their tendency towards decoupling, their neurodiversity, or their background, to understand that words can harm, and to be sensitive to the need to stop and reverse course when presented with credib... (read more)
Seeing the discussion play out here lately, and in parallel seeing the topic either not be brought up or be totally censored on LessWrong, has made the following more clear to me:
A huge fraction of the EA community's reputational issues, DEI shortcomings, and internal strife stem from its proximity to/overlap with the rationalist community.
Generalizing a lot, it seems that "normie EAs" (IMO correctly) see glaring problems with Bostrom's statement and want this incident to serve as a teachable moment so the community can improve in some of the respects above, and "rationalist-EAs" want to debate race and IQ (or think that the issue is so minor/"wokeness-run-amok-y" that it should be ignored or censored). This predictably leads to conflict.
(I am sure many will take issue with this, but I suspect it will ring true/help clarify things for some, and if this isn't the time/place to discuss it, I don't know when/where that would be)
[Edit: I elaborated on various aspects of my views in the comments, though one could potentially agree with this comment/not all the below etc.]
There's definitely no censorship of the topic on LessWrong. Obviously I don't know for sure why discussion is sparse, but my guess is that people mostly (and, in my opinion, correctly) don't think it's a particularly interesting or fruitful topic to discuss on LessWrong, or that the degree to which it's an interesting subject is significantly outweighed by mindkilling effects.
Edit: with respect to the rest of the comment, I disagree that rationalists are especially interested in object-level discussion of the subjects, but probably are much more likely to disapprove of the idea that discussion of the subject should be verboten.
I think the framing where Bostrom's apology is a subject which has to be deliberately ignored is mistaken. Your prior for whether something sees active discussion on LessWrong is that it doesn't, because most things don't, unless there's a specific reason you'd expect it to be of interest to the users there. I admit I haven't seen a compelling argument for there being a teachable moment here, except the obvious "don't do something like that in the first place", and perhaps "have a few people read over your apology with a critical eye before posting it" (assuming that didn't in fact happen). I'm sure you could find a way to tie those in to the practice of rationality, but it's a bit of a stretch.
Thanks for clarifying on the censorship point!
I do think it's pretty surprising and in-need-of-an-explanation that it isn't being discussed (much?) on LW - LW and EA Forum are often pretty correlated in terms of covering big "[EA/rationalist/longtermist] community news" like developments in AI, controversies related to famous people in one or more of those groups, etc. And it's hard to think of more than 1-2 people who are bigger deals in those communities than Bostrom (at most, arguably it's zero). So him being "cancelled" (something that's being covered in mainstream media) seems like a pretty obvious thing to discuss.
To be clear, I am not suggesting any malicious intent (e.g. "burying" something for reputational purposes), and I probably shouldn't have used the word censorship. If that's not what's going on, then yes, it's probably just that most LWers think it's no big deal. But that does line up with my view that there is a huge rationalist-EA vs. normie-EA divide, which I think people could agree with even if they lean more towards the other side of the divide than me.
LessWrong in-general is much less centered around personalities and individuals, and more centered around ideas. Eliezer is a bit of an outlier here, but even then, I don't think personality-drama around Eliezer could even raise to the level of prominence that personality-drama tends to have on the EA Forum.
I don't find this explanation convincing fwiw. Eliezer is an incredible case of hero-worship - it's become the norm to just link to jargon he created as though it's enough to settle an argument. The closest thing we have here is Will, and most EAs seem to favour him for his character rather than necessarily agreeing with his views - let alone linking to his posts like they were scripture.
Other than the two of them, I wouldn't say there's much discussion of personalities and individuals on either forum.
I think that you misunderstand why people link to things.
If someone didn't get why I feel morally obligated to help people who live in distant countries, I would likely link them to Singer's drowning child thought experiment. Either during my explanation of how I feel, or in lieu of one if I were busy.
This is not because I hero-worship Singer. This is not because I think his posts are scripture. This is because I broadly agree with the specific thing he said which I am linking, and he put it well, and he put it first, and there isn't a lot of point of duplicating that effort. If after reading you disagree, that's fine, I can be convinced. The argument can continue as long as it doesn't continue for reasons that are soundly refuted in the thing I just linked.
I link people to things pretty frequently in casual conversation. A lot of the time, I link them to something posted to the EA Forum or LessWrong. A lot of the time, it's something written by Eliezer Yudkowsky. This isn't because I hero-worship him, or that I think linking to so... (read more)
Here are the last four things I remember seeing linked as supporting evidence in casual conversation on the EA forum, in no particular order:
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=HebnLpj2pqyctd72F - link to Scott Alexander, "We have to stop it with the pointless infighting or it's all we will end up doing," is 'do x'-y if anything is. (It also sounds like a perfectly reasonable thing to say and a perfectly reasonable way to say it.)
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=SCfBodrdQYZBA6RBy - separate links to Scott Alexander and Eliezer Yudkowsky, neither of which seem very 'do x'-y to me.
https://forum.effectivealtruism.org/posts/irhgjSgvocfrwnzRz/?commentId=NF9YQfrDGPcH6wYCb - link to Scott Alexander, seems somewhat though not extremely 'do x'-y to me. Also seems like a perfectly reasonable thing to say and I stand by saying it.
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=x5zqnevWR8MQHqqvd - link to Duncan Sabien, "I care about the lives we can save if we don't rush to conclusions, rush to anger, if we can give each other the benefit of the doubt for five freaking minutes and consider wh... (read more)
Also, I would add at the very least Gwern (which might be relevant to note given the current topic) and Scott Alexander as two other clear cases of "personalities" on LW.
I agree that there are of course individual people that are trusted and that have a reputation within the community, but the frequency of conversations around Scott Alexander's personality, or his reputation, or his net-effect on the world, is much rarer on LW than it is on the EA Forum, as far as I can tell.
Like, when was actually the last thread on LW about drama caused by a specific organization or individual? In my mind almost all of that tends to congregate on the EA Forum.
My guess is that you see this more in EA because the stakes are higher for EAs. There's much more of a sense that people here are contributing to the establishment and continuation of a movement, the movement is often core to people's identities (it's why they do what they do, live where they live, etc), and 'drama' can have consequences on the progress of work people care a lot about. Few people are here just for the interesting ideas.
While LW does have a bit of a corresponding rationality movement I think it's weaker or less central on all of these angles.
Yep, I agree, that's a big part of it.
I think Jeff is right, but I would go so far to say the hero worship on LW is so strong that there's also a selection effect - if you don't find Eliezer and co convincing, you won't spend time on a forum that treats them with such reverence (this at least is part of why I've never spent much time there, despite being a cold calculating Vulcan type).
Re drama around organisations, there are way more orgs which one might consider EA than which one might consider rationalist, so there's just more available lightning rods.
I for one probably wouldn't have brought it up on LessWrong because it seems like a tempest in a teapot. What is there to say? Someone who is clearly not racist accidentally said something that sounds pretty racist, decades ago, and then apologized profusely. Normally this would be standard CW stuff, except for the connection to EA. The most notable thing — scary thing — is how some people on this forum seem to be saying something like "Nick is a bad person, his apology is not acceptable, and it's awful that not everyone is on board with my interpretation" ("agreed", whispers the downvote brigade in a long series of -1s against dissenters.) If I bring this up as a metadiscussion on LW, would others understand this sentiment better than me?
I suspect that the neurotypicals most able to explain it to weirdos like me are more likely to be here than there. Since you said that
I assume you mean the apology, and I would be grateful if you would explain what these glaring problems are. [edit: also, upon reflection, maybe it's not a neurodiverse vs. neurotypical divide, but something else such as political thinking ... (read more)
Registering strong disagreement with this characterisation. Nick has done vanishingly little to apologise, both now and in 1997. In the original emails and the latest apology, he has done less to distance himself from racism than to endorse it.
In what ways do you think the 2023 message endorses racism? Is there a particular quote or feature of it that stands out to you?
The apology contains an emphatic condemnation of the use of a racist slur:
The 1996 email was part of a discussion of offensive communication styles. It included a heavily contested and controversial claim about group intelligence, which I will not repeat here. [1] Claims like these have been made by racist groups in the past, and an interest in such claims correlates with racist views. But there is not a strict correlation here: expressing or studying such claims does not entail you have racist values or motivations.
In general I see genetic disparity as one of the biggest underlying causes of inequality and injustice.... (read more)
Yes, I agree that there's a non-trivial divide in attitude. I don't think the difference in discussion is surprising, at least based on a similar pattern observed with the response to FTX. From a quick search and look at the tag, there were on the order of 10 top-level posts on the subject on LW. There are 151 posts under the FTX collapse tag on the EA forum, and possibly more untagged.
I very much agree with your analysis, except for the "IMO correctly". Firstly, because I hold the views of a "rationalist-EA", so this is to be expected given your argument. Secondly, because we should not hold emails/posts against people 25+ years later, unless they repeat them or the views remain deeply relevant to their positions today. Looking at his recent publications, they do not seem that relevant.
However, I would like to point out that, to me, EA also benefits from the rationalist influx. EA to me is "rationality applied to doing good". So the overlap is part of the deal.
This is inaccurate as stated, but there is an important truth nearby. The apparent negatives you attribute to "rationalist" EAs are also true of non-rationalist old-timers in EA, who trend slightly non-woke, while also keeping arms length from the rationalists. SBF himself was not particularly rationalist, for example. What seems to attract scandals is people being consequentialist, ambitious, and intense, which are possible features of rationalists and non-rationalists alike.
Happy to comment on this, though I'll add a few caveats first:
- My views on priorities among the below are very unstable
- None of this is intended to imply/attribute malice or to demonize all rationalists ("many of my best friends/colleagues are rationalists"), or to imply that there aren't some upsides to the communities' overlap
- I am not sure what "institutional EA" should be doing about all this
- Since some of these are complex topics and ideally I'd want to cite lots of sources etc. in a detailed positive statement on them, I am using the "things to think about" framing. But hopefully this gives some flavor of my actual perspective while also pointing in fruitful directions for open-ended reflection.
- I may be able to follow up on specific clarifying Qs though also am not sure how closely I'll follow replies, so try to get in touch with me offline if you're interested in further discussion.
- The upvoted comment is pretty long and I don't really want to get into line-by-line discussion of specific agreements/disagreements, so will focus on sharing my own model.
Those caveats aside, I think some things that EA-rationalists might want to think about in light of recent events... (read more)
I'll limit myself to one (multi-part) follow-up question for now —
Suppose someone in our community decides not to defer to the claimed "scientific consensus" on this issue (which I've seen claimed both ways), and looks into the matter themselves, and, for whatever reason, comes to the opposite conclusion that you do. What advice would you have for this person?
I think this is a relevant question because, based in part on comments and votes, I get the impression that a significant number of people in our community are in this position (maybe more so on the rationalist side?).
Let's assume they try to distinguish between the two senses of "racism" that you mention, and try to treat all people respectfully and fairly. They don't make a point of trumpeting their conclusion, since it's not likely to make people feel good, and is generally not very helpful since we interact with individuals rather than distributions, as you say.
Let's say they also try to examine their own biases and take into account how that might have influenced how they interpreted various claims and pieces of data. But after doing that, their honest assessment is still the same.
Beyond not broadcasting their view, and trying to treat people fairly and respectfully, would you say that they should go further, and pretend not to have reached the conclusion that they did, if it ever comes up?
Would you have any other advice for them, other than maybe something like, "Check your work again. You must have made a mistake. There's an error in your thinking somewhere."?
I would have to think more on this to have a super confident reply. See also my point in response to Geoffrey Miller elsewhere here--there are lots of considerations at play.
One view I hold, though, is something like "the optimal amount of self-censorship, by which I mean not always saying things that you think are true/useful, in part because you're considering the [personal/community-level] social implications thereof, is non-zero." We can of course disagree on the precise amount/contexts for this, and sometimes it can go too far. And by definition in all such cases you will think you are right and others wrong, so there is a cost. But I don't think it is automatically/definitionally bad for people to do that to some extent, and indeed much of progress on issues like civil rights, gay rights etc. in the US has resulted in large part from actions getting ahead of beliefs among people who didn't "get it" yet, with cultural/ideological change gradually following with generational replacement, pop culture changes, etc. Obviously people rarely think that they are in the wrong, but it's hard to be sure, and I don't think we [the world, EA] should be aiming for a cultur... (read more)
Is it okay if I give my personal perspective on those questions?
- I suppose I should first state that I don't expect that skin color has any effect on IQ whatsoever, and so on. But ... I feel like the controversy in this case (among EAs) isn't about whether one believes that or not [as EAs never express that belief AFAIK], but rather it is about whether one should do things like (i) reach a firm conclusion based purely on moral reasoning (or something like that), and (ii) attack people who gather evidence on the topic, just learn and comment about the topic, or even don't learn much about the topic but commit the sin of not reaching the "right" conclusion within their state of ignorance.
- My impression is that there is no scientific consensus on this question, so we cannot defer to it. Also, doesn't the rationalist community in general, and EA-rationalists in particular, accept the consensus on most topics such as global warming, vaccine safety, homeopathy, nuclear power, and evolution? I wonder if you are seeing the tolerance of skepticism on LW or the relative tolerance of certain ideas/claims and thinking the tolerance is problematic. But maybe I am mistaken about whether the typica
My view is that the rationalist community deeply values the virtue of epistemic integrity at all costs, and of accurately expressing your opinion regardless of social acceptability.
The EA community is focused on approximately maximising consequentialist impact.
Rationalist EAs should recognise when these virtues of epistemic integrity and epistemic accuracy conflict with maximising consequentialist impact, whether via direct, unintended consequences of expressing your opinions or via effects on EA's reputation.
For what it's worth, I have my commitment to honesty primarily for consequentialist reasons.
I would say it's less about rationalists vs non-rationalists and more that people who are inclined to social justice norms (who tend not to be rationalists, though one can be both or neither) think it's a big deal and people who aren't are at least less committal.
Note that there is now at least one post on LW front page that is at least indirectly about the Bostrom stuff. I am not sure if it was there before and I missed it or what.
And others' comments have updated me a bit towards the forum vs. forum difference being less surprising.
I still think there is something like the above going on, though, as shown by the kinds of views being expressed + who's expressing them just on EA Forum, and on social media.
But I probably should have left LW out of my "argument" since I'm less familiar with typical patterns/norms there.
The indirectness is also quite relevant to that. On LessWrong it's pretty encouraged to take current events and try to extract generalizable lessons from them, and make statements that are removed from the local political landscape. I am glad that post was written, and would have been happy about it independently of any Bostrom stuff going on.
https://www.lesswrong.com/posts/GqD9ZKeAbNWDqy8Mz/a-general-comment-on-discussions-of-genetic-group
I liked CEA's statement
- Writing statements like this is really hard. It's the equivalent of writing one tweet on something that you know everyone is gonna rip to pieces. I think there are tradeoffs here that people on the forum don't seem to acknowledge. I am very confident (90%) that a page-length discussion of this would have been worse in terms of outcomes.
- I don't think it was for us - I think it was for journalists etc. And I think it performed its job of EA not being dragged into all of this. Note how much better it was than either Anders' statement or Bostrom's - no one externally is discussing it, and in an adversarial environment that means it's succeeded.
- I think it was an acceptable level of accuracy. It's very hard to write short things, but does EA roughly hold that all people are equal? Yes, I think that's not a bad four-word summary. I think a better summary is "the value of beings doesn't change based on their position in space or time and I reject the many heuristics humanity has used to narrow concern which have led to the suffering we see today - racism, sexism, speciesism, etc". I think that while more precise that phrase isn't that much more accurate and is worse...
The first statement would be viewed positively by most, the second would get a raised eyebrow and an "And what of it?", the third is on thin fucking ice, and the fourth is utterly unspeakable.
2-4 aren't all that different in terms of fact-statements, except that IQ ≠ intelligence, so some accuracy is lost moving to the last. It's just that the first makes it clear which side the speaker is on, the second states an empirical claim, and the next two look like they're... attacking black people, I think?
I would consider the fourth a harmful gloss - but it doesn't state that there is a genetic component to IQ, that's only in the reader's eye. This makes sense in the context of Bostrom posing outrageous but Arguably Technically True things to inflame the reader's eye.
I think people woul...
Bostrom was essentially still a kid (age ~23) when he wrote the 1996 email. What effect does it have on kids' psychology to think that any dumb thing they've ever said online can and will be used against them in the court of public opinion for the rest of their lives? Given that Bostrom wasn't currently spreading racist views or trying to harm minorities, it's not as though it was important to stop him from doing ongoing harm. So the main justification for socially punishing him would be to create a chilling effect against people daring to spout off flippantly worded opinions going forward. There are some benefits to intimidating people away from saying dumb things, but there are also serious costs, which I think are probably underestimated by those expressing strong outrage.
Of course, there are also potentially huge costs to flippant and crass discussion of minorities. My point is that the stakes are high in both directions, and it's very non-obvious where the right balance to strike is. Personally I suspect the pendulum is quite a bit too far in the direction of trying to ruin people's lives for idiotic stuff they said as kids, but other smart people seem to disagree.
As some othe...
Many progressive institutions spend a great deal of time highlighting racial differences. I really wish they would not. Even worse, they go on to attribute these gaps to discrimination and nefariousness on the part of oppressor groups. If gaps are not due to discrimination, then it is immoral to place blame on the designated oppressor group for discrimination. In other contexts, this is common sense. It is wrong to attribute Jewish success to coordinated conspiracies and exploitation because their success is largely attributable to higher average cognitive ability and intellectual culture.
There are successful minority groups throughout the world who are resented because their higher socioeconomic status is attributed to exploitation. I think this is an unfortunate situation. If anything, attributing socioeconomic outcomes to exploitation leaves a group open to violence moreso than attributing socioeconomic gaps to average cognitive ability differences.
Few people think it is moral to commit acts of violence against less intelligent people. Even fewer probably think it is acceptable to commit acts of violence against a group because they are members of a group with a ...
I plausibly agree. There are times and places to bring up racism and sexism, their historical contexts, and instances where they still exist today. But I also get the sense that people would generally be happier (plausibly even many minorities(?), though I'm not at all sure about that) if they ruminated on these ideas less often. Rumination can both exacerbate the pain of actual injustices and make one perceive injustices where they may not actually exist or don't exist much (manspreading, Shirtgate, etc). Note that this point can also apply to anti-woke people: focusing a bit less on the perceived wrongs of cancel culture might make them happier.
Believers in genetic racial IQ gaps often say their viewpoint is needed in cases like affirmative action, to show that it's not necessarily discriminatory if the demographic composition of some elite group doesn't match the demographic composition of the whole population. But if we were more race-blind and didn't think much about demographic composition to...
Some notes on the last paragraph in my above comment:
When I used the phrase "SJWs", I intended it to have either neutral valence or a valence of friendly teasing. I agree with some amount of the SJW agenda myself. However, Wikipedia says that since 2011, the term is primarily used as an insult and is associated with the alt-right, which was not an implication I had in mind. Like Bostrom's 1996 email and 2023 apology, this example is an illustration that it can be difficult to realize exactly how a given word or statement may be perceived, especially if people are reading it as if it were a dog whistle.
Part of my reason for using the term "SJW" was that I didn't want to say merely "leftist" or "progressive". I was a strong leftist and progressive in the early aughts, and back then, people with that ideology were, in my experience, generally more focused on trying to improve people's welfare via economic and other government-level policy. Progressives didn't spend as much time as they do now on shaming individual people or groups. I think the woke-ward shift of the last decade, while it raises some important issues that were less highlighted in the past, is plausibly overall less use...
Great comments, Brian. You should spend more time on the Forum!
Thanks. :) I feel somewhat bad about spending time on this topic rather than my usual focus areas, especially since many of my points were already made by others. Plus, as I mentioned and as Bostrom learned, anything you say about controversial topics online is fodder for political enemies to take out of context. But I have a (maybe non-utilitarian) impulse to stick up for what I think is right even if some people will dislike me for doing so. (For a time, my top-level comment here had a net agreement of -10 or so. Of course, maybe the downvoters were correct and I'm wrong.)
Bostrom's email was in response to someone who made the point you do here about provocation sometimes making people view things in a new light. The person who Bostrom was responding to advocated saying things in a blunt and shocking manner as a general strategy for communication. Bostrom was saying to them that sometimes, saying things in a blunt and shocking manner does nothing but rile people up.
Interesting! I admit I didn't go and read the original discussion thread, so thanks for that context. To the extent that Bostrom was arguing against being needlessly shocking, he was kind of already making the same point that his critics have been making: don't say needlessly shocking things. He didn't show enough sensitivity/empathy in the process of presenting the example and explaining why it was bad, but he was writing a quick email to friends, not a carefully crafted political announcement intended to be read by thousands of people.
I see. :) I would think people would consider biological differences much more plausible in the gender case than the race case. I've heard several people say that when you're a parent to both a boy and a girl, the differences between them are unmistakeable even in the first ~2 years. I think many American adults at least privately understand that there are big biological differences between the brains of men and women, while most American adults probably expect no non-trivial biological racial brain differences. But yeah, any particular gender difference, such as the language gap, could be mostly or all environmental.
Fair enough. :) Some headlines called the FTX leadership "a gang of kids", which I think isn't unreasonable, even though they were in their late 20s or early 30s. The main thing I wanted to convey is that people at this age often have limited life experience or understanding of how the world works and so often do dumb things. Youth is a time to explore weird ideas and make mistakes. Therefore, I would agree that 23-year-olds generally shouldn't be entrusted to hold important decision-making positions unless they've shown a track record of unusual maturity.
It's a great point, and not at all aggressive. :)
I said that 23-year-olds should demonstrate "a track record of unusual maturity" in order to have important positions, not that they should always be denied them. In some cases, such as becoming the president of the USA, a minimum age requirement may make sense because the stakes are so high, although one could say that we should just let voters decide if any given person is qualified.
But you're right that I support a strong prior against, say, tasking a 23-year-old to run a major organization -- a prejudice that needs to be overcome with strong enough evidence of maturity and competence -- in a way that it would be abhorrent to do for a member of a particular racial group.
It's interesting to ponder the reasons for different attitudes toward racism vs ageism. My two main guesses are:
Average differences in traits based on age are sometimes quite large, enough that the value of using the prejudice for making predictions can exceed the unfairness downsides of stereotyping people. For example, my impression is that young men are on average much riskier drivers than older men, so there's not a ton of society-level outrage about charging...
I agree with your comment in general, but I'm not quite sure about this point. I think age-based discrimination has been / is quite severe (though perhaps it is also often justified, since age does make a lot of difference to people's abilities):
- Children are often forced to go sit in a small room all day, subject to the arbitrary whims of a single adult with little oversight, and often have to endure criminal violence from other children with little recourse, in a way that would be unacceptable for older people.
- Young men have been repeatedly conscripted to fight in wars with high mortality rates.
- Old people might face compulsory redundancy.
- Young people have to pay taxes to fund benefits for older people, even if those retirees did not have to pay those taxes when they were young, and these retirement benefits may not be available by the time the young retire.
- Many facilities...
Good list. :)
I think school is vastly less bad than, say, slavery, with some possible exceptions like if there's extreme bullying at the school.
You're right that the violence children endure from each other (and sometimes from their own parents) would be unacceptable if done to adults. If one adult hits another, that's criminal assault/battery. If a kid hits another kid, that's just Tuesday.
Children are also subject to the arbitrary whims of their parents, and are made to do unpaid labor against their will, though usually parents don't treat their own children extremely badly. (Of course, some parents do horrifically abuse or neglect their children.)
In any case, as you said, to some extent the lack of freedom for children is inevitable. (Actually, there is a way to avoid it entirely: don't have children, which is the antinatalist solution. If sentient beings didn't exist, none of the problems we're discussing here would be problems anymore.)
It's definitely right to look at historical and other social context to explain current and past attitudes towards discrimination. A utilitarian framework is probably not the right approach, nor are most other ethics systems. I doubt there was ever a time in the modern era when attitudes were consistent, and there's loads of social conditioning going on. I don't think many women felt angry in the 19th century when their heads of government were (almost?) invariably men, because "that's just how things are" and nobody else was getting angry about it anyway.
My favorite example of current discrimination that totally flies under the radar of the collective ire is height discrimination. Only 6 of 46 US presidents[1] have been of below-average height; a result at least this extreme has less than a 0.005 chance of occurring by randomness (i.e. your chances of becoming president are two orders of magnitude lower if you are short). This is not totally unknown (occasionally there's a paper or article about height advantages), but people perceive it as a mere curiosity. Personally, speaking as a short guy, this absolutely fails to anger me either.
1: https://www.thoughtco.com/shortest-presidents-4144573
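The "less than 0.005" figure above can be sanity-checked with a quick binomial tail calculation. This is a rough sketch under the simplifying assumption that, absent any height effect, each of the 46 presidents would independently have had a 50% chance of being below average height:

```python
from math import comb

# Probability of observing 6 or fewer "below average height" presidents
# out of 46 under a fair-coin null hypothesis (p = 0.5 per president).
n, k_max, p = 46, 6, 0.5
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_max + 1))
print(f"P(X <= {k_max}) = {prob:.2e}")
```

Under this assumption the tail probability comes out far below the 0.005 bound quoted above, so the claim is, if anything, conservative (though the real null probability of being "below average height" need not be exactly 0.5).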
Thanks. :) I mainly had in mind something more like wisdom, rather than intelligence. Social norms on particular topics are often not what you would expect by armchair reasoning. In many cases, you have to directly encounter people expressing those norms, or see news stories / hear gossip about people who have run afoul of those norms, to know what they are. Nerds who are very interested in science/math/theoretical things may be less likely to learn about these norms than the average person, despite having high fluid intelligence. (BTW, this is one reason I've updated toward thinking reading some amount of news is important.) I imagine that people told Bostrom that what he said in 1996 wasn't cool, and if so, that was a useful learning experience for him. The only problem was that it was written down for posterity to see.
I think cultural context is also relevant to judging these things. Most young people today (even most nerds) know that what Bostrom said (even though it was in the context of giving an example of what you shouldn't say) would elicit strong negative reactions, given how much media attention these things receive. I assume this was less obvious to nerds in the 1990s (...
It is perhaps important to note that in the original email, Bostrom quite directly says that he is aware of the social norm about not saying what he said. In fact, that was one of the main points of the email: that saying something true in a blunt manner about a controversial topic is likely to be viewed as offensive. If Bostrom learned anything -- and indeed, he apologized within 24 hours -- it was that saying something like that can be inadvisable even among friends.
In general, I don't think old people have a stronger understanding of social norms than younger people. Old people will of course have more experience to draw from, and their mannerisms will have gone through more trial and error. In that sense, I agree: old people are often wiser. But the frontier of cultural norms is generally driven by young people, and...
Interesting point! I hadn't even heard of "stating one's pronouns while introducing oneself", although maybe that's because I rarely meet anyone in person.
As you said, there's a tension between young people having the cutting edge of norms versus older people knowing a greater quantity of norms, even though some may be stale.
I think the obsession among young people with political correctness increased dramatically in the last 10 years, and it was barely a discussion topic when I was in pre-college school. Usually it seemed to be teachers and administrators trying to inculcate anti-bullying lessons into the students. At the anti-bullying workshops, students often rolled their eyes. So I'm not sure how true it would have been to say that students were at the vanguard of social norms in my school. (I went to a pretty liberal public school in upstate New York.)
I may also be generalizing too much from my own past self, since I was often called "oblivious" at Bostrom's 23-year age and wasn't that well informed about scandals, maybe because I thought they were too gossip-y and not as important as "serious" topics. (Now I realize that gossip is actually very important.)
An attempt to express the "this is really bad" position.
These are not my views, but an attempt to describe others.
Imagine I am a person who occasionally experiences racism or who has friends or family for whom that is the case. I want a community I and my friends feel safe in. I want one that shares my values and acts predictably (dare I say, well-aligned). Not one that never challenges me, but one where lines aren’t crossed and if they are, I am safe. Perhaps where people will push back against awful behaviour so I don’t have to feel constantly on guard.
Bostrom’s email was bad and his apology:
And to add to that, rather than the community saying "yes that was bad", a top response is "I stand with Bostrom". I understand that people might say "trust us, you know we are good and not racist", but maybe I don't trust them. Or maybe my friends or family are asking me whether I know this Bostrom guy or whether he's part of my community.
And maybe I am worried that Bostrom et al don't have the interests of people of colour at heart when they think ...
That populations vary significantly on IQ. Someone I talked to said that they had found a correlation between people who held these views and people who treated them as if they were less intelligent.
So — this person believes IQ cannot vary significantly by population? Or that one mustn't say so?
In the Flynn effect, populations vary significantly on IQ depending on when they were born. So, assuming the Flynn effect isn't controversial, I suppose you meant "populations grouped by skin color". But, I would ask, if timing of birth is correlated with IQ, then couldn't location of birth be correlated with IQ? Or poverty, or education?
I could continue this line of reasoning, but... somehow it doesn't feel useful. Positions people take on this can be arbitrarily extreme, e.g. some people object to any attempt to measure intelligence. If such a person sees the Bostrom "apology", they could be mad that he hasn't denounced these so-called "IQ tests" as illegitimate.
And I guess your point wasn't about logic, after all, but about feelings. So let me share my feeling: I find it extremely threatening and scary when people in/around EA — you know, EA, the concept I am building my whole life around — are vaguely treating someone who (to me) is obviously not racist as if he were a racist. It's like suddenly my neighbors joined a mob and are carrying a city councilman toward the giant tree in the...
You haven't said what "these beliefs" refers to, but given the preceding context, you seem to be strongly objecting not to any belief Bostrom holds, but to his lack of belief. In other words, it is threatening and frightening (in context) that Bostrom said: "It is not my area of expertise, and I don’t have any particular interest in the question. I would leave to others, who have more relevant knowledge, to debate whether or not in addition to environmental factors, epigenetic or genetic factors play any role".
You mention a Wikipedia article that you don't link to directly. I think you mean this one. Perhaps the most notable thing in this article is the following:
I suppose you believe either (1) that it was completely unacceptable that Bostrom did not study up on this topic before wr...
A good friend sent me a Vice article on this with the comment "EA is falling apart [laughing emoji]". He's a great friend, but to be honest this comment pissed me off, and more than the FTX coverage, this round feels like a particularly hateful attack on EA.
Think what you want about Bostrom and his comment; digging through someone's previous online engagements to find some dirt to hurt them and their associated organizations and acquaintances is personally disgusting to me. I really hope that we don't engage in similar tactics in response, though I don't think that's a real worry, because the general level of decency from EAs at least seems to be higher than the ever-lowering bar journalists set.