To my eye, a lot of EAs seem to under-appreciate the extent to which your response to a crisis isn't just a reaction to an existing, fixed set of societal norms. The act of choosing a response is the act of creating a norm.

You're helping bring into existence a particular version of EA, a particular version of academia and intellectual life, and a particular world-at-large. This is particularly true insofar as EA is a widely-paid-attention-to influencer or thought leader in intellectual society, which I think it (weakly) is.

It's possible to overestimate your influence, but my impression is that most EAs are currently underestimating it. Hopefully this post can at least bring to your attention the hypothesis that your actions matter for determining the norms going forward, even if you don't currently think you have enough evidence to believe that hypothesis.

If you want the culture to be a certain way, I think it's worth taking the time to flesh out for yourself what the details would look like, and then testing whether you can effect that norm, or at least move things in the direction that seems best to you.

Anchor more to what you actually think is ethical, to what you think is kind, to what you think is honorable, to what you think is important and worth protecting. If you think your peers aren't living up to your highest principles, then don't give up on your principles.

(And maybe don't give up on your peers, or maybe do, depending on what seems right to you.)

Don't ignore the current set of norms you see; but be wary of willing bad outcomes into being via self-fulfilling prophecies.

Be dissatisfied, if the world doesn't already look like the vision of a good, wholesome, virtuous world you can picture in your head. Because as someone who feels optimistic about EAs' capacities to do what's right, I want to see more people fighting for their personal visions of what that involves.

Comments

Rob -- I guess it's valid and unobjectionable to say we should try to stay conscious about what norms we're creating, following, and reinforcing, with a view towards how we shape the future culture of EA.

However, it sounds like you might be alluding to a somewhat more specific claim that certain good norms are currently at risk of being compromised or corrupted, and/or certain bad norms are at risk of spreading.

I wonder if you could spell this out a bit more? I'm not sure what your intended takeaway is, or in what direction you're trying to nudge us (e.g. towards more 'inclusive kindness', or towards more 'epistemic integrity'?).

From my perspective, there's currently a sort of tug-of-war happening between three EA orientations (which already existed, though things like the Bostrom e-mail have inspired people to put them into words and properly attend to them):

  • one that's worried EA risks losing sight of principles like "honesty" and "open discourse" (and maybe "scientific inquiry and scholarship"),
  • one that's worried EA risks losing sight of principles like "basic human compassion" and "care and respect for everyone regardless of life-circumstance" (and potentially also "scientific inquiry and scholarship", with a different model of what the science says),
  • and a more pragmatic "regardless of all that stuff, the PR costs and benefits are clear and those considerations dominate the utilitarian calculus here".

Obviously these aren't mutually exclusive, at least at that level of abstraction. You can simultaneously be worried about PR risk, about EAs' epistemic integrity eroding, and about EAs' interpersonal compassion eroding. But a lot of the current disagreement seems to pass through EAs putting different relative weightings on these three classes of consideration.

On the current margin, I want to encourage more discussion that's coming from the first or second perspective, relative to the third perspective. And I want the third perspective to flesh out its models more, and consider a larger option and hypothesis space — in part just so it's easier to tell whether EAs are drawing on explicit models of the strategy space at all. (Versus arguments based more on mental motions like "feel socially unsafe 🢡 generate post-facto arguments for why the socially-safe-feeling local action is also the best consequentialist strategy".)

Rob -- this is an excellent clarification; thank you. I agree that there are these three orientations in some tension within EA, and that it's important to be honest about some of these trade-offs, and their implications for the future social and intellectual and ethical norms in EA. 

I guess I'm mostly in the first camp. 

I'm also sympathetic to taking PR issues seriously -- but I think some EA organizers have sometimes been rather inept and amateurish at handling the PR issues, and aren't working with an accurate model of how PR crises flare up and die down, or of how much to say, and what to say, when the crises are at their peak.

IMHO, we need to realize that any sufficiently large, powerful, and well-funded movement (like ours) will reliably and frequently experience PR crises, some of which arise from deliberate PR attacks from individuals or organizations hostile to our mission. 

That will be the new normal. We'll have to learn to live with it, and to have the PR crisis response teams in place to deal with it. And we'll need the emotional resilience to accept that not everyone will like, love, or respect what we do -- and that's OK.

I think this is generally true, but might benefit from some examples or more specifics.

I think this is generally false, but might benefit from some examples or more specifics.

(Referring to the OP, not the comment)

My impression is that many people are subconsciously or implicitly aware of this dynamic, and that this contributes to the high level of interest in topics or decisions that are likely to set the tone for the future. I think many people are acting in ways they hope will set the standard, precisely because they want the movement to be defined by how they act. I don't mean to single out any particular point of view as being motivated in this way, because in my experience most of the views being expressed here are sincerely held and principled.

Nevertheless, I applaud your reminder, because I think becoming consciously aware of what is going on could make people more conscientious about defining those norms. It has shades of Nietzsche's "Eternal Recurrence", which I quite enjoy.
