I'd like to take a moment to mourn what the discourse doesn't have. 

It's unfortunate that we don't trust each other. 

Buck's comment

I won't try to enumerate right now (you're encouraged to try in the comments) the vastly different types of anonymous forum participation. People's reasons for not putting their names to posts and comments vary so widely that I would miss at least one. 

Separately, I'd like to take a moment to mourn the fact that this short note about movement drama can be expected to generate more comments than my effortposts about my actual work can hope to get. 

But I think it's important to point out, for anyone who hasn't noticed yet, that the presence of burner accounts is a signal we're failing at something.

Think of how much more this excellent comment of Linch's would have meant if the OP had been out and proud.

I would like to be able to say that I feel like a coward when I hold my tongue for reputational reasons, without anyone who's used a burner account hearing me and responding with "so you're saying I'm a coward". There are too many reasons out there for people to use burner accounts for me to say that. 

I'm normally deeply sympathetic to romantic discussions of the ancient internet values, in which anonymity was a weapon against the biases of status and demographics. I usually lament the identity-fication of the internet that arrived around the time of Facebook. But there is a grave race to the bottom of integrity standards when we tolerate infringements on anyone's ability - or indeed their inclination - to tell the truth as they see it and own the consequences of standing up and saying it. 

I'm much more saying "if burner account users are correctly or rationally responding to the environment (with respect to whatever risk tolerance they have), then that's a signal to fix the environment" than I am saying "burner account users are not correct or rational". But I think at the margin, some of the burnerified comments I've seen have crossed the line into, I say as I resist a perceptible urge to say behind a burner account, actual cowardice. 

Comments

I'm concerned that there's an information cascade going on. That is, some claims were made about people being negatively affected by having posted public criticism; as a result some people made critical posts anonymously; that reinforces the perception that the original claim is true; more people post anonymously; the cycle continues.

But I just roll to disbelieve that people facing bad consequences for posting criticism is a serious problem. I can totally believe that it has happened at some point, but I'd be very surprised if it's widespread. Especially given how mild some of the stuff that's getting anonymously posted is!

So I think there's a risk that we meme ourselves into thinking there's an object level problem when there actually isn't. I would love to know what if any actual examples we have of this happening.

This is true. Another reason I think public fears of professional retaliation are overstated is that "I'm afraid of professional retaliation" is generally taken as a legitimate reason to hide, whereas lots of other fears are not, and so many other fears get justified in terms that will be well-received. Like, if saying "I'm posting anonymously because I'm afraid of being looked at funny" is seen as cowardly but saying "I'm posting anonymously because I'm afraid of professional retaliation" is seen as sympathetic, then I expect both types of people will claim to fear professional retaliation.

(I do think EA institutions have a totally-normal-for-white-collar-professional degree of retaliation for not toeing the line. I just think the discourse here overweights how much of it comes from, like, posts on this forum, whereas all the cases I know about were because of more substantive causes like materially supporting a disfavored institution, or normal bureaucratic power struggles, or something.)

RobBensinger:
Oh, that makes total sense to me and I now feel silly for not believing this before I read your comment.

RobBensinger:
Sounds right to me. (At least, this sounds like a big part of the source, though maybe not the biggest part?)

JoshuaBlake:
Unfortunately all we have is anecdotes and vibes. I don't think there's any way to really have an informed opinion here.

I find some of the comments here a bit implausible and unrealistic.

What people write online will often affect their reputation, positively or negatively. It may not necessarily mean they, e.g. have no chance of getting an EA job, but there are many other reputational consequences. 

I also don't think that updating one's views of someone based on what they write on the EA Forum is necessarily always wrong (even though there are no doubt many updates that are unfair or unwarranted).

Here's my take for why this might not be a serious issue:

  • It's very common in other forums I'm a part of for literally everyone except me to be various levels of pseudonymous and I find it pretty rare that people use their real name, especially for giving spicy takes that people disagree with. I'm actually pretty pleasantly surprised by the number of people in EA who give spicy takes with their real name attached.

  • Likewise, I think it's pretty natural to want to be able to speak unfiltered without having to worry about how what you say will affect your reputation, either rightly or wrongly. Especially given how bad tail risks to your reputation can be, I understand the fear of using a real name to challenge the status quo, and I'm super glad people still have an outlet to give critical spicy feedback. Jeff Kaufman writes about Responsible Transparency Consumption - plus what Holden said - and this is a norm I very much value and I think is higher in EA than elsewhere, but still not a guarantee.

That being said, I am definitely super duper concerned by the number of people who say something literally like "posting anonymously because I don't want to lose out on jobs or tank my ca...

Larks:

the view that EA employers are so fragile as to deny job opportunities based on EA Forum hot takes is hopefully greatly exaggerated and very disturbing if not. 

In my personal experience most EA orgs have been extremely tolerant of even harsh criticism. I would guess that criticising CEA, MIRI, CSER, GWWC, LTFF, EVF, CE, OP, GCRI, APPGFG, GAP (and probably others I have forgotten) has been overall positive for me. I don't know of any other movement where leaders are as willing to roll with the punches as they are here and I think they deserve a lot of credit for that. 

On the other hand, ACE did attempt to cancel and defund Anima because of some hot takes on Facebook, so maybe things are worse on the animal welfare side of things, which I am less familiar with.

I agree with your first paragraph. I disagree about your read of the ACE <> Anima situation, but there's no need to re-litigate that here. Regardless of your feelings about the Anima situation, I do think ACE has been criticized so many times in so many different ways by so many different people and they have been good about it.

Ben_West:
That link says "Attempting to cancel an animal rights conference speaker because of his views on Black Lives Matter" which I think is different from objecting to criticism about ACE itself?

[comment deleted]

At Rethink Priorities, we don't take people's EA Forum user history into account at all when hiring

This seems incredibly surprising to me. Someone writes the best post in existence on whether mealworms suffer, you are considering hiring them to research invertebrate sentience, and you're like "don't care – your research was posted on the forum so I'm not going to look at it?"

Do you look at anything people have done before working at RP?

Disclaimer: Work at RP, but am not speaking on behalf of RP here or those involved in hiring processes.

I think this is basically accurate for the standard hiring round, depending on what you consider "when hiring". For example, my understanding is that knowing an author has written a post like the one you describe would likely contribute to whether RP reaches out to invite them to apply for a role (though this bar is much, much lower than having authored such a post), but such an invitation confers no advantage during the hiring process itself, which leans rather heavily on the test task / skills assessment portions of the hiring process (one example here).

RP also aims not to select for knowledge and experience in EA beyond what is relevant to the specific role the candidate is applying for (keeping in mind that much EA knowledge, like other knowledge, can be learned). My personal impression is that having a process of checking EA Forum history, even if it were somehow blinded, would risk biasing this in favour of active EAF users in a way that was not reliably predictive for selecting the best candidate.

I have less insight into the processes t...

Do you look at anything people have done before working at RP?

Not really. We mainly have people do test tasks. Usually people who have written really good things on the EA Forum also do really well on test tasks anyways.

In the past we have had people submit writing samples and we've graded those. That writing sample could be an EA Forum post. So in that case, a great EA Forum post could directly affect whether you get hired.

Definitely open to this being a bad approach.

The main thing I was trying to get at though is that having downvoted EA Forum comments or participating in some spicy takes doesn't affect RP employment.

Ben_West:
Interesting, thanks!

It's very common in other forums I'm a part of for literally everyone except me to be various levels of pseudonymous and I find it pretty rare that people use their real name, especially for giving spicy takes that people disagree with. I'm actually pretty pleasantly surprised by the number of people in EA who give spicy takes with their real name attached.

I think there might be an important difference between pseudonymous and burner accounts here. I basically have no problem with consistent pseudonymous identities, whereas I share the feeling that having a bunch of throwaway accounts posting anonymous complaints is kind of bad. 

Peter Wildeford:
Yeah I think that's right. Though it's hard to know whether the burner or throwaway accounts belong to people who otherwise post on the EA Forum under a pseudonymous account - they could be burners created by people who otherwise post under their real name, or burners created by people who don't otherwise really use the EA Forum.

because the view that EA employers are so fragile as to deny job opportunities based on EA Forum hot takes is hopefully greatly exaggerated and very disturbing if not.

There's at least some evidence to suggest these fears are justified. Take the thankfully scrapped "PELTIV" proposal for tracking conference attendees:

Individuals were to be assessed along dimensions such as “integrity” or “strategic judgment” and “acting on own direction,” but also on “being value-aligned,” “IQ,” and “conscientiousness.” Real names, people I knew, were listed as test cases, and attached to them was a dollar sign (with an exchange rate of 13 PELTIV points = 1,000 “pledge equivalents” = 3 million “aligned dollars”).

I don't think it's unreasonable to be worried that if people are being tracked for their opinions at conferences, their forum presence might also be. I'll repeat that this proposal was scrapped, but I get why people would be paranoid.  

There's also the allegation in the "Doing EA Better" post that:

Hiring and funding practices often select for highly value-aligned yet inexperienced individuals over outgroup experts.

If this is true, then criticising EA orthodoxy might make you less "value aligned" in the eyes of EA decision makers, and cost you real money. 

Maybe people have started to use "value-aligned" to mean "agrees with everything we say", but the way I understand it, it means "_cares_ about the same things as us". Being value-aligned does not mean agreeing with you about your strategy, or anything else much. In fact, someone posting a critical screed about your organization on the EA Forum is probably some decent evidence that they are value-aligned: they cared enough to turn up in their spare time and talk about how you could do things better (implicit: to achieve the goal you both share).

There are definitely some criticisms that suggest that you might not be value-aligned, but for most of the ones I can think of it seems kind of legitimate to take them into account. e.g. "Given that you wrote the post 'Why animal suffering is totally irrelevant', why did you apply to work at ACE?"

So, there are many things that could be said about PELTIV, but I'm not convinced that filtering for value-alignment filters negatively for criticality; if anything, I think it's the opposite.

Larks:

There are definitely some criticisms that suggest that you might not be value-aligned, but for most of the ones I can think of it seems kind of legitimate to take them into account. e.g. "Given that you wrote the post 'Why animal suffering is totally irrelevant', why did you apply to work at ACE?"

Yeah in contrast I would generally expect a post called "Statistical errors and a lack of biological expertise mean ACE have massively over-estimated chicken suffering relative to fish" to be a positive signal, even though it is clearly very critical. 

I agree with you that filtering for alignment is important. The mainstream non-profit space speaks a lot about filtering for "mission fit" and I think that's a similar concept. Obviously it would be hard to run an animal advocacy org with someone chowing down on chicken sandwiches every day for lunch in the organization cafeteria.

But my hot take for the main place I see this go wrong in EA: Some EAs I have talked to, including some quite senior ones, overuse "this person may be longtermist-adjacent and seems to be well-meaning but they just don't give me enough vibes that they're x-risk motivated and no I did not actually ask them about it or press them about this" -> "this is not a person I will work with" as a chain of reasoning, to the point of excluding people with nuanced views on longtermism (or with just confused views, who could learn and improve), and this makes the longtermist community more insular and worse. I think PELTIV and the like have a similar flavor of making snap judgements from afar without actually checking them against reality (though there are other clear problems as well).

My other take about where this goes wrong is less hot and basically amounts to "EA still ignores outside expertise too much because the experts don't give off enough EA vibes". If I recall correctly, nearly all opinions on wild animal welfare in EA had to be thrown out after discussion with relevant experts.

quinn:
Fortunately this can be fixed by publishing pamphlets with the correct sequences of words helpfully provided, and creating public knowledge that if you're serious about longtermism you just need to whisper the correct sequence of words to the right person at the right time. Jokes aside, there's an actual threat of devolving into applause-light factories (I'll omit the rant about how the entire community building enterprise is on thin ice). Indeed, someone at Rethink Priorities once told me either that they weren't convinced the hiring process was doing a good job at separating "knows what they're talking about, can reason about the problems we're working on, cares about what we care about" from "ideological passwords, recitation of shibboleths", or that it was one of the things they really wanted to get right and weren't confident they were getting right. It's not exactly easy.

Peter Wildeford:
Yeah I certainly don't think our hiring process is perfect at this either. These kinds of concerns weigh on me a lot and we're constantly thinking about how we can get better.

Michael_PJ:
I haven't seen that but if that's happening then I agree that's bad and we should discourage it!

Jason:
The article specifically claimed "Low PELTIV value was assigned to applicants who worked to reduce global poverty or mitigate climate change, while the highest value was assigned to those who directly worked for EA organizations or on artificial intelligence." That suggests that a post advocating for a reallocation of effort to the former might be relevant.

titotal:
I agree that if "value-aligned" is being used in the sense you are talking about, then it's fine. The allegations are that it's not being used in that sense - that it's being used to punish people in general for having unorthodox beliefs. The article I linked states as much. This would be completely fine if you were in an AI risk organisation: obviously you mostly want people who believe in the cause. But this is the Centre for Effective Altruism. It's meant to be neutral, but this proposal would have directly penalised people for disagreeing with orthodoxy.

Michael_PJ:
It's not clear from the article whether the high PELTIV score came from high value-alignment scores or something else. If anything, it sounds like there was a separate cause-specific modifier (but it's very hard to tell). So I don't think this is much evidence for misuse of "value-aligned".

A few months ago I would have easily agreed with "the view that EA employers are so fragile as to deny job opportunities based on EA Forum hot takes is hopefully greatly exaggerated and very disturbing if not."

However, then I read about the hiring practices at FTX, and significantly updated on this. It's now hard for me to believe that at least some EA employers would not deny job opportunities based on EA forum hot takes!

Where is there info on hiring practices at FTX? I don't remember seeing this and would be interested. 

More generally, I would be really interested in hearing about particular examples of people being denied job opportunities in EA roles because of opinions they share on the EA Forum (this would worry me a lot). 

The only rumor I've heard is that someone was once denied an opportunity because they were deemed not a longtermist, and the only way the org could've known the person was not a longtermist was from their public writing, and I personally wasn't sold that strongly holding longtermist values was a key requirement for the position. That being said, I've only heard it from the person who didn't get hired, and it's possible that I may be substantially misunderstanding the situation.

I definitely would like to hear other people's views on this, from burner accounts if need be.

ChanaMessinger:
Huh, I feel mixed about this. I want there to be ways and places to just talk and not have all-things-considered opinions and not be too strongly judged for it (and I know some people hold to a "what's the best thing this person has done/said" standard rather than "what's the quality of the average thing they said"), for epistemics and probably because it's sensible in a bunch of ways, but it would also be confusing to me if people's behavior on the public internet didn't give evidence of the kind of employee they are or their views in ways that might matter? Maybe we're just litigating how much of a grace buffer there should be (where maybe we agree it should be pretty big).

Peter Wildeford:
Good god I certainly hope any practices (EDIT: meaning specifically the obvious trash fire practices) at FTX are not common at other EA orgs. To be clear, I only really know how Rethink Priorities operates and I have minimal insight into the operations of other groups.

Good god I certainly hope any practices at FTX are not common at other EA orgs.

FTX seems to have been a trash fire in many different respects at once, but the above sentence seems super hyperbolic (you hope zero practices at FTX are common at EA orgs??), and I don't know what the non-hyperbolic version of it in your mind is.

I'm somewhat wary of revisionist history to make it sound like FTX was more wildly disjoint from EA culture or social networks than it in fact was, at least in the absence of concrete details about what it was actually like to work there.

Peter Wildeford:
Yes, my statement was intentionally hyperbolic. I definitely did not mean to say that there are absolutely zero practices at FTX that I like, nor did I mean to suggest that FTX is disjoint from EA culture (though I know so little about what FTX was like or what EA culture is like outside of RP that it is hard for me to say).

The base rate I have in mind is that FTX had access to a gusher of easy money and was run by young, energetic people with minimal oversight and limited use of formalized hiring systems. That produced a situation where top management's opinion was the critical factor in who got promoted or hired into influential positions. The more that other EA organizations resemble FTX, the stronger I would think this.

Peter Wildeford:
I suspect "easy money" is an important risk factor for "top management's opinion [is] the critical factor in who got promoted or hired into influential positions" but it certainly doesn't have to be the case!

quinn:
Do you mean FTX the exchange or FTX the Future Fund? 

projectionconfusion:
This is not just a question of the attitude of EA employers but of wider society. I have been involved in EA for a long time but now work in a professional role where reputation is a concern, so I do all my online activity pseudonymously. I would dislike it if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs". And that would probably be bad for the variety of perspectives and expertise in EA discussions. 

I have been using a burner account recently as opposed to my account with my real name following the Bostrom controversy. That decision is not motivated by any fear of reprisal within EA communities; at my local EA group, I am perfectly happy to espouse the beliefs that I'd want anonymity for on here.

The reasons for doing so are as follows:

Potential Costs: EA seems to be under a microscope in the current landscape (See Bostrom, FTX, recent Time article on SA). This forum is not viewed only by people with beliefs in charitable understanding and respect for evidence-driven conclusions. If I say something "controversial" to an EA and provide sufficient evidence, I have much more confidence that, even if they disagree, they will be understanding of my thought process. I have no fear of social costs among EAs; not so with journalists trawling for someone to quote as a "eugenicist" or other pejorative. This and showing up as a Google result could have severe costs to my reputation and life outcomes.

Lack of Potential Benefits: Because of EA's high decoupling norms, I don't think that attaching my real name to posts provides much marginal value. In my experience and viewing others', people ...

Buck:

Holden Karnofsky on evaluating people based on public discourse:

I think it's good and important to form views about people's strengths, weaknesses, values and character. However, I am generally against forming negative views of people (on any of these dimensions) based on seemingly incorrect, poorly reasoned, or seemingly bad-values-driven public statements. When a public statement is not misleading or tangibly harmful, I generally am open to treating it as a positive update on the person making the statement, but not to treating it as worse news about them than if they had simply said nothing.

The basic reasons for this attitude are:

  • I think it is very easy to be wrong about the implications of someone's public statement. It could be that their statement was poorly expressed, or aimed at another audience; that the reader is failing to understand subtleties of it; or that the statement is in fact wrong, but that it merely reflects that the person who made it hasn't been sufficiently reflective or knowledgeable on the topic yet (and could become so later).
  • I think public discourse would be less costly and more productive for everyone if the attitude I take were more common. I think tha...

Buck seems to be consistently missing the point. 

Although leaders may say "I won't judge or punish you if you disagree with me", listeners are probably correct to interpret that as cheap talk. We have abundant evidence from society and history that those in positions of power can and do act against those who criticise them. A few remarks to the contrary should not convince people they are not at risk.

Someone who genuinely wanted to be open to criticism would recognise and address the fears people have about speaking up. Buck's comment of "the fact that people want to hide their identities is not strong evidence they need to" struck me as highly dismissive. If people do fear something, saying "well, you shouldn't be scared" doesn't generally make them less scared, but it does convey that you don't care about them - you won't expend effort to address their fears. 

Although leaders may say "I won't judge or punish you if you disagree with me", listeners are probably correct to interpret that as cheap talk.

GiveWell liked your criticism so much they literally started a contest to get more like it and gave you tens of thousands of dollars. 

I'm trying to read your comment charitably but come on. Saying this quote is "cheap talk" when you've personally profited from it not being cheap talk is unfair to the point of being actively deceptive. 

Yellow:
This is a conflation of technical criticism (e.g. you critique a methodology or offer scientific evidence to the contrary) and office politics criticism (e.g. you point out a conflict of interest or question a power dynamic). Plant made a technical criticism, whereas office politics disagreement is the kind that potentially carries social repercussions. Besides, EA orgs aren't the only party that matters - the media reads this forum too, and I can see how someone might not want a workplace conflict to become their top Google result.

MichaelPlant:
Hmm. I guess I was thinking about this in general, rather than my own case. That said, I don't think there's any contradiction between there being visible financial prizes for criticism and people still rationally thinking that (some forms of) criticism will get you in trouble. Costly signals may reduce fears, but that doesn't mean they will remove or reverse them. Seems worth noting that there has just been a big EA critiques prize and people are presently using burner accounts.

Buck's comment of "the fact that people want to hide their identities is not strong evidence they need to" struck me as highly dismissive. If people do fear something, saying "well, you shouldn't be scared" doesn't generally make them less scared, but it does convey that you don't care about them - you won't expend effort to address their fears.

But Buck wasn't saying you shouldn't be scared? He was just saying that high burner count isn't much evidence for this.

Precisely, I think he was claiming that p(lots of burners | hiding identity is important) and p(lots of burners | hiding identity isn't important) are pretty close.
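
To spell out the odds-form reading of that claim (a minimal sketch; the notation just restates the two conditional probabilities in the comment above):

$$\frac{P(\text{hiding is important}\mid\text{lots of burners})}{P(\text{hiding isn't important}\mid\text{lots of burners})}=\frac{P(\text{lots of burners}\mid\text{hiding is important})}{P(\text{lots of burners}\mid\text{hiding isn't important})}\times\frac{P(\text{hiding is important})}{P(\text{hiding isn't important})}$$

If the likelihood ratio on the right is close to 1, observing lots of burner accounts barely moves the prior odds - which is all the "not strong evidence" reading amounts to.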

I interpreted this as a pretty decoupled claim. (I do think a disclaimer might have been good.)

Now, this second comment (which is the root comment here) does try to argue you shouldn't be worried, at least from Holden and somewhat from Buck.

quinn:
Any ideas for more costly signals? One failure mode, the one of "contrarian connie has a take that no one thinks is very good but it is such a thorough and comprehensive take that it would look virtuous of us to hire a critic", seems broadly bad. A corollary of the badness of this failure mode, from connie's perspective, is that there may not be a principled classifier that sorts out "disagrees with my take" from "retaliation for challenging the status quo". 

Ivy Mazzola:
[EDIT: I'm getting disagrees and I'd really appreciate it if people could explain how I'm wrong that posting controversial things under a real name is better, in expected value terms? Likelihood of pros vs likelihood of cons? Or tell me which other piece you disagree with?]

In that case, here's a claim to the contrary, which I believe it is easy to find evidence of: We are in a social movement where you get social status for being critical, attempting to solve problems proactively, and going against the grain; you get extra status for doing it bravely and publicly (as opposed to in the backrooms or something); you also get (heaps of) social status for admitting you were wrong and redacting your claim; and you get points for doing conversation well. So, here are 4 scenarios I see (which again I'm not collecting evidence of, but I believe it is there for all to see):

1. If you use your real name to write a criticism and it is well received, that's a win.

2. If you use your real name to post a criticism and it is not well received, and you are convinced you were wrong, you can post a redaction, or do both: edit the top of your post and add a few comments saying commenters were right. You can also DM people thanks for changing your mind. You will get points for epistemic humility and bringing issues to light so they can be addressed, and that's a win.

3. If you use your real name to post criticism and it is not well received, but you still believe your own side, then you won't be the only one to believe it. You get to have your name attached to the idea, and people who still inevitably agree with you can reach out and give you opportunities. Plus you can feel liberated to put your energy elsewhere. Why would you want to work with people who don't agree with you in cases relevant to your work? And if it isn't relevant to your work, oh well. Now, here's the real win: if you are proven right in the long run! Think of the points in and outside the movement. Example:

Hello Ivy! I think you've missed at least one scenario, which is where you use your real name, your criticism is not well received, you have identified yourself as a troublemaker, and those in positions of power torpedo you. Surely this is a possibility? Unless people think it's a serious possibility, it's hard to make sense of why people write things anonymously, or just stay silent.

Ivy Mazzola:
Honestly I don't think this is a good enough reason to post anonymously though... [Edit: In that I don't think this risk makes the expected value of using your name negative. Even if there is a serious chance, it's surely a relatively small one compared to other outcomes. Above I was attempting to show that there are pros to posting under your real name which outweigh the cons in expected value terms. I don't think that changes the calculus.]

I also think it may not be a con exactly if that happens. Or it's a pro and con nested together? Because I think EAs torpedoing people would be a serious issue that would need revealing. I don't get why people wouldn't want to use this moment to test the toxicity of the movement they want to be in tbh. Why would you just... not want to know if you'd be torpedoed? In that case it falls within #3 still: "If your critique is not well received... You can feel liberated to put your energy elsewhere." And if you get torpedoed, you can always write an exposé about that and help people who might be in a similar situation in the future!

Maybe someday I'll be torpedoed tbh. If this is a thing that leaders will do to anyone, I'm probably weird enough and have done misinterpretable enough things. Maybe I'd deserve it and maybe not, but either way I'd like to know if it's going to happen so that I can move on. In case you can't tell, I have already had concerns like the anons seem to have, but I've decided I don't like thinking "Is EA for me or not?" It's not pleasant or motivating. Honestly it's amazing how much that question drags down your potential. I want the answer, not to live with the question. As long as I don't get torpedoed when being myself (or develop serious problems with the movement), that answer is yes. Which is just much more workable. Why would I not want to know? EA is not, like, sacred. It can and should be thrown out of your roster if it's a bad fit or a problematic entity. [Edit: I wonder if EAs just need to be taug...

Aptdell:
Yes, this seems like the human default, and I think anyone who claims the default doesn't apply to them bears the burden of proof to demonstrate that. If people like Buck want to convince me that they're different, the best way to do it would be to give a satisfactory theory of why this happens, then explain the specific measures they take and why they believe those measures are effective (e.g. "this RCT found that the use of Technique X produced a large effect size; I always use Technique X in situations A, B, C"). A person who's succeeded in solving a problem should be able to demonstrate understanding of both the problem and its solution. Edit: This paper looks interesting https://pubmed.ncbi.nlm.nih.gov/22663351/

quinn:
"say things that feel true but take actual bravery" (as opposed to "perceived bravery" where they're mis-estimating how unpopular a sentiment is) is definitely a high variance strategy for building relationships in which you're valued and respected, unavailable to the risk-intolerant, can backfire. 

I think there are two separate dynamics at play here:

I think we could do more to avoid punishing opinions perceived as wrong. An example of punishing behavior at play is my own comment two days ago. I made it while being too upset for my own good and lashed out at someone for making a completely reasonable point.

I don't blame the user I replied to for wanting an anonymous account when that is the response they can expect. 

 

 

Secondly, I suspect that people are vastly overrating just how much anybody cares about who says what on the forum.

While I understand why someone making a direct attack on some organization or person might want to stay anonymous, most things I see posted from anonymous accounts seem to just be regular opinions that are phrased with more hostility than the mean.

It's a bit weird to me why somebody would think that a few comments on the EA Forum would do all that much to one's reputation. At EA Globals, at most, I've had a few people say "oh you are the one who wrote that snakebite post? Cool!" and that's about it.

It all feels very paranoid to me. I'm way too busy worrying about whether I look fat in this t-shirt to notice or care that somebody wrote a com...

I don't know. Reading the post from the burner account that wants to get rid of weird people, and who thinks my wife is a poisonous creep because she views inviting coworkers to BDSM parties as a completely reasonable thing to do, made me actually understand for the first time the emotions that underlie parts of cancel culture. 

This both makes me think that my instinct that he shouldn't be hired for anything should be taken less seriously, and makes me think that cancel-culture-style concerns - about simply not wanting to work near a person who has certain attitudes about the sort of person that you are - actually should be taken at least somewhat seriously.

It probably was wise for him to go anonymous before saying that weird people like me are generally bad actors who do not deserve an assumption that we are speaking in good faith (I know that is not what he actually said. That is what it feels like he said, and I suspect if I ever was in a position where I knew who he was, and was deciding whether he should get a job, I would not be able to ignore my sense that he is attacking me personally and my right to exist, and assess him in the way that I would look at other candidates -- and this is despite knowing that my reading of what he said is uncharitable).

Could you explain why you and your wife think inviting coworkers to a BDSM party is not a problem? I am genuinely curious for your perspective.

I think people might be imagining some pretty different situations? Compare:

  1. Employee A approaches new hire B at lunch and says "I'm putting together a BDSM party this weekend, let me know if that's the sort of thing you might be into."

  2. Employees A and B have become close over a long time working together. They talk about a lot of things, and have gradually become more comfortable sharing details about their personal lives. At this point they both know that the other is into BDSM, and A invites B to a BDSM party they're organizing.

[EDIT: in both cases imagine the employees are at the same level, and not in each other's management chains]

There's a continuum from 1 to 2, and while I do expect some of the disagreement here is about whether to treat BDSM parties as different from other social activities (if #1 were about a board game party then it would probably be widely viewed as welcoming), my guess is most of it is about how much information people are imagining A has about whether B would like to receive an invitation?

Wil Perkins:
A board game party is very different. There have been reams and reams written about power dynamics in corporations and the issues that sex brings into them. I find it concerning that so many EAs are so blasé about this topic, especially when it comes to something like BDSM, which explicitly involves dominance and submission in most circumstances. That seems like a ridiculously unhealthy dynamic if, for instance, someone invited a direct report or an employee to a BDSM party. I don't think the amount of time spent getting to know the employee matters too much. Imagine being an employee: your boss invites you to one of these parties, and then asks you to perform submissive acts. I find that incredibly problematic, and it sounds like this exact scenario happens frequently at big EA orgs. I wouldn't go so far as to say all sexual relations between coworkers should be banned, but I do think the current norms are unhealthy. Not sure exactly how to fix it without being unnecessarily draconian though.

Edit: I'll also add that, as Tim admits (and I truly appreciate the candor here), when it comes to things like sex it's very difficult to disentangle emotions. Tim would feel personally attacked if someone doesn't like his and his wife's sexual practices. In reality I see this very easily extending to people who accept sexual advances. I think it would be extremely difficult for a boss to be neutral when deciding which employee to promote if they have slept with one of them. 

BDSM is not primarily about sex, and sex mostly does not happen at BDSM parties, at least not the ones that I've been at. A sex party is a different thing. The impression I get from your comment is that you are not very familiar with the BDSM scene -- though I might be wrong. There isn't any tell in it that shows that you definitely are ignorant about a basic fact; it is just a vibe I'm feeling.

In any case, as far as I know, neither of us works at an EA org, and from your comment it seems like you imagine that what happens at these parties is very different from what I imagine happens at them (which is not to say I'm correct; they probably occur in the Bay Area or London, where the scene is very different, and vastly bigger, than where I live).

Also, I think we have a different set of priors here about sex, relationships and careers. 

And again, I am self-employed, and have been for the entire time I've earned meaningful money, and I'm male, so my intuitive pov is likely missing important things. And also, my resistance to changing norms in EA around sex is not about thinking that there shouldn't be a norm where managers don't sleep with subordinates in EA orgs -- there p...

Geoffrey Miller:
Tim -- excellent comment. I agree that a lot of the EA people who seem to be freaking out about the very idea of being invited to BDSM events seem to know less than nothing about BDSM, and are relying on third-hand media stereotypes about the subculture. A good rule of thumb about highly stigmatized sexual subcultures is: if one hasn't read anything about them, hasn't watched any interviews with people in the subculture, hasn't gone to any events in the subculture, and doesn't have any close friends involved in the subculture, then one's takes about the subculture are likely to have very low epistemic quality. 

a lot of the EA people who seem to be freaking out about the very idea of being invited to BDSM events seem to know less than nothing about BDSM, and are relying on third-hand media stereotypes about the subculture

On the other hand we're talking about situations where someone is inviting their coworkers to BDSM parties. While (as I said above) I think this can be ok if the asker already knows the askee is into this kind of thing, consider the more dubious cases where the asker doesn't:

A: I'm putting together a BDSM party this weekend, let me know if that's the sort of thing you might be into.

B: Um, no thanks.

How B feels here depends mostly on their likely-uninformed understanding of what happens at these parties.

Geoffrey Miller:
Jeff -- fair point.  I guess the key issue is, who's responsible for having misunderstandings and stereotypes about a popular sexual subculture, if those misunderstandings and stereotypes lead to negative reactions or to offense being taken. 

Lauren Maria:
I don’t think it’s necessarily about having misunderstandings or stereotypes. I was the original person who commented this. I think people have different levels of comfort when it comes to mixing their sex and work lives. I personally have strong boundaries in professional settings. Ultimately I think everyone has different preferences here, and I get the sense that EA groups maybe have a slightly different culture than what I’m used to when it comes to personal/professional boundaries. Should that be changed? I’m not so sure, I was mostly posting it as a question, and to show my own perspective.

Wil Perkins:
Thanks for the thoughtful reply! I am indeed not aware of how BDSM culture operates and definitely made a lot of assumptions. My apologies for that! I guess I agree we should welcome people, and my general sense is that optimal sexual/relationship norms should be far more open and less prudish than they are today. As I’ve said elsewhere though, when work and large decisions regarding money/power get involved l become wary of sexual relationships.

Jeff Kaufman:
I agree that a board game party is very different and was trying to say so; I'm sorry my comment was confusing here. Rephrasing:

  • I see two main explanations for why people might disagree about the scenario: (i) disagreement about how much information A has about whether B would like to receive an invitation, and (ii) disagreement about how we should socially treat BDSM parties vs other parties.

  • Even though there isn't consensus on (ii), I think most of the disagreement on "can it be ok to invite coworkers to BDSM parties" is around (i).

Edited my comment to specify that both people are at the same level.

Wait, what? This isn't something I'd heard of -- what makes you think it happens at all, let alone is frequent?

Wil Perkins:
A mix of anecdotal stories from friends, and common posts up here, like Tim's, clearly claiming that this happens and that he and his wife are part of it. Perhaps the "frequently" is uncharitable; I'm admittedly a bit frustrated on this topic. Still it seems like you aren't really answering the core of my criticism - that this creates extremely unhealthy and abusive power dynamics...

quinn:
You might be underrating the role of theatricality in the culture, the role silliness plays in the appeal for a lot of people, not to mention the high bar for communication and introspection. Heck, people of the culture are known to say things like "can I hug you?", which is more interpersonally scrupulous than the base-rate "normal" person who, I've observed, seems to use this "hugs for women; handshakes for men" heuristic that it's super autistic to second-guess. I think if the community theater was putting on King Lear, and my boss got the part of the king, and I got the part of a servant, no one would think there's a CoI or unscrupulosity, right? Or if I got the part of Lear and my boss got the part of a servant, whatever. (Flagging that I'm hesitating to participate in this subthread because of topicalness, but didn't think leading a pivot to DMs made sense since the cat was already out of the bag.) 

Wil Perkins:
I'm admittedly not in the polyamory/SF/Rationalist part of the EA movement, and going based off second or third-hand stories and things posted on the EA forum.

timunderwood:
I didn't claim anything of that sort. Neither of us works for an EA org. My wife strongly does not identify as EA. I'm not even into BDSM. I just claim my wife would consider inviting an interested coworker to a munch or a party to be a totally reasonable thing to do if it naturally came up.

Jeff Kaufman:
Tim said his wife "views inviting coworkers to BDSM parties as a completely reasonable thing to do". I agree that this is ambiguous, which is why I wrote my comment. But I don't see how you can interpret it as a clear claim that she is a manager at an EA org who invites her direct reports to BDSM parties and asks them to perform submissive acts? (Unless your "this exact scenario" was intended to be hyperbole?) Can you say more about what you mean by "this"? If you mean inviting subordinates then I agree, but not if you mean inviting equals you have strong reason to think would like to be invited.

Wil Perkins:
I think inviting equals you have strong reason to think would like to be invited is also problematic. People don't stay equals forever. They get promoted, move to other orgs, leave and receive grants, etc. Like I said, I don't have a fully fleshed out way to fix the problem of sex influencing decisions that should be made impartially, but it's clearly a problem in my view.

Jeff Kaufman:
Do you also think it's a problem for equals to start dating?

Wil Perkins:
Without the addition of polyamory, I think it's fine in a limited sense. However, if every single person in a position of power in EA is also dating another person of power, even monogamously, I would see that as a problem. Adding in polyamory on top of that complicates things to a far greater degree. As a disclaimer, I have nothing against polyamory; I just think it can lead to unhealthy workplace dynamics if it becomes a norm.

timunderwood:
If you have an office with a laid-back culture (ie the sort of place that you'd want to work at anyways), and it is a topic of conversation that came up naturally, and the coworker actually seems interested, why not invite them to a beginner friendly thing related to one of your main hobbies?

Wil Perkins:
I suppose when you phrase it this way it's less weird-sounding; I may be convinced after a long discussion. But my current prior is that lots of casual sex with coworkers is overall a large negative for an organization, as it makes it far more difficult to make decisions based on the mission of the org instead of personal feelings.

I have had an influential senior EA explicitly deny support for my project because they didn't like my opinions and decided on my behalf that I wasn't going to suitably change them (obviously I can't be very specific, not least because they didn't tell me which opinions were unacceptable, so I can't say much to show whether this might have been justified in my case. But for the record, I'm not into justifying oppression, harassing people, defrauding people, or any of the classic reasons to kick someone out of an EA party).

I would be a lot more comfortable putting my name and more details to this anecdote if the general attitude of senior EA staff was more 'shit, all these anecdotes should strongly update us towards the belief that this happens. We should actively try to find out about such cases and do something about them' rather than 'we polled each other and everyone said they hadn't done this, so I guess it doesn't happen'. 

Maybe there's nothing more they could actually do; but until they figure something out I will feel compelled to use a burner account for certain criticisms.

quinn:
It's hard. On the one hand, Duncan is thoroughly dead right in this subthread. There's a bizarre conspiracism and entitlement around experiencing disagreement that I think is poisonous. On the other hand, formal and informal power itself poisons proper disagreement. We kinda should expect discourse to be like this.

One time I misled a company about how cool I thought their projects were because I was poor and they had money. In my life, a breach of personal integrity like this is well quarantined from EA; I'd rather eat chicken 4x a week for a year and kick a puppy for good measure than manipulate EAs, and I don't think formal or informal power asymmetries mean that manipulation would be "justified defense" or "punching up". But when I try to reverse engineer this mindset I'm detecting from the "it was wrong of the employer not to hire someone they in fact disagree with about the fundamentals" crowd, I find myself wondering: are people poor? Is that why it gets adversarial? It's ok if that's what's going on with some of them; I don't think it justifies a manipulative mindset or strategy against EAs, and I think they should go manipulate and be strategic in a low-stakes setting until their brain repairs itself from the damages of poverty, but it's something I could understand.

There's a related topic of making EA orgs robust to password guessing, but I'm not running an org so that's none of my business. But if I have colleagues who are in the password-guessing mindset, I absolutely want to know about it, and I want them to bust out of that mindset. 

Duncan Sabien:
In other words: "I'd rather extract money/support from people who wouldn't willingly give it to me, if I were being candid." :/ :/ :/ :/ Like, I'm not saying I don't get it, but the "it" that I'm getting seems super duper sad. People's money/support/endorsement should be theirs to give, and tricking people into giving it to you when they wouldn't if they knew your true positions seems ... not great.

How is anonymous posting any different in this regard than self-censoring by not posting at all? People don't owe anyone their thoughts on the Forum, and deciding to be silent (or post anonymously) isn't "tricking people".

Duncan Sabien:
It greatly increases the odds of the forum being flooded with unaccountable bs; it removes/breaks the feedback loop of reputation. Deciding to be silent isn't tricking people. Posting anonymously because you don't want to be associated with your own views (but you want to inject those views without paying the reputational cost of having them) is.

Aptdell:
There are low-bs forums such as Hacker News and slatestarcodex where most people don't use their real names. I'm skeptical that real name use is a good predictor of bs density.

There are 2 possibilities: either the view is correct, or the view is incorrect. If the view is correct, the reputational cost is bad, and the person did a service by posting it anonymously. If the view is incorrect, "injecting" it by posting it anonymously seems not that bad, because others can reply and explain why it's incorrect. Insofar as we respect the ability of forum readers to evaluate the correctness of what they read, we shouldn't be too worried about people posting incorrect views. It's through discussion itself that we can better determine which views are correct vs incorrect. Posting anonymously does not instill a false belief in another person, so I don't see the trickery.

I found your comment concerning. Important-to-enforce punitive measures for wrongthink could be justified in certain limited situations (maybe stuff like "justifying oppression, harassing people, defrauding people"), but let's just say that if we're running into those situations on a regular basis, the EA movement is doing something very wrong.

Duncan Sabien:
In those forums, reputation accrues to username; little (or at least less) attention is paid to brand-new accounts. Here, a lot of accounts are trying to recruit/use the "I'm a for real serious longtime member of this community" reputational/seriousness boost, while being who the heck knows who.

Jason:
Then one solution would be to have a trusted third party vet the burner's identity under an NDA that allows them to verify agreed-upon non-identifying information, such as EAG attendance history, employment history, etc.

If anyone wants me to validate their otherwise anonymous account I'd be happy to do that.

Duncan Sabien:
I would support that.

Aptdell:
I've spent a lot of time on both HN and slatestarcodex (subreddit/blog comments), and this isn't really the case. Most usernames I see are ones that I have no particular recollection of. I basically never look at the username to decide whether to read a comment. HN will display your username in green for the first two weeks after you register, and you get the ability to downvote after accumulating 500 karma, but for the most part people ignore usernames. (Example: I once got a reply from a user who said "I keep seeing people make this argument." It took me a little while to realize that was because the two of us had a related discussion a few weeks ago, where I'd made this argument to them. It was only after I looked through my comment history for the older discussion that I realized what was going on.) An exception here is that in both communities there are a few celebrity users that get upvotes more easily, but they're a small minority. If you want you could create a post in one of those communities asking people how much they pay attention to usernames and see what responses you get. Maybe a trusted neutral party could vouch for these claims?

Jason:
We have a system for deciding if a post is "bs" -- the upvote/downvote system. The community is quite capable of dealing with actual "bs" by downvoting it into oblivion. The evidence is that the community does not find the bulk of burner-poster posts to be "bs". It is also very easy for users who do not want to engage with burner posts to skip on past them. The engagement that happens is with users who have chosen to engage with that content.

Duncan Sabien:
I disagree that the community is doing anything remotely close to a good job of distinguishing bs from non-bs via downvotes. [The evidence is that the community does not find the bulk of burner-poster posts to be "bs"] is a true statement, and is revealing the problem. This is straight false; they're showing up on all sorts of posts WAY more than they used to.

sick_burnerer:
I wasn't hiding anything at the time - my views were evidently public when I spoke to this person. Maybe I didn't make this clear enough in the first post: by clear inference and implication, I was turned down because of critical views on senior EAs/orgs - because I don't have any other views that anyone ever suggested would be problematic in this context. I'm hiding now because of that same experience. As Jason says, I didn't need to post, and it's quite unpleasant to do so. I posted to contradict the narrative promoted by senior EAs that criticising EA leadership has no reputational risk - which seems highly relevant to a thread telling people they shouldn't use burner accounts to criticise EA leadership.

I think as this place becomes a place that gets quoted in external publications more, it will be harder to speak freely as a named account without a lot of practice and a very thick skin. I don't really know what to do here. 

I have spent so long on Twitter that I red-team almost everything I say for "being able to be taken out of context", even while trying to be very honest and straightforward. If one hasn't done this work, then it seems easy to avoid the anxiety of being quoted as "a forum commenter, Adam Parsons thought" by saying things anonymously.

I think as this place becomes a place that gets quoted in external publications more, it will be harder to speak freely as a named account without a lot of practice and a very thick skin. 

I mean, perhaps easier said than done, but some of my main advice would be: grow a thicker skin, and spend less time anxiously refreshing social media or focusing your attention on low-quality hit pieces.

EA is prominent enough now that we will in fact get a lot less good done in the world if we devote a day or two to reading and debating every news article that mentions us. (Bad-but-illustrative example: Imagine trying to solve a physics problem while also feeling a panicky impulse to read every news article in the world that mentions physicists.)

Writing under stable pseudonyms also seems like it captures a lot of the value of writing stably under your real name.

"I have spent so long on twitter that I red team almost everything I say for 'being able to be taken out of context'"

That... sounds incredibly unhealthy to me, and makes me think we need better alternatives to Twitter -- some platform where it's easier to just ignore haters and talk to people who are there to do collaborative truth-seeking and good-faith meeting of minds.

If EAs feel pressure to do that level of red-teaming whenever they post on Twitter, I'd suggest they strongly consider not using Twitter.

1
NickLaing
1y
Thanks for this Nathan, I should red team my comments more. At times I probably write too much and too quickly without considering some potentially incriminating readings. I think this comment is OK though. Or is it?

Something to note is that by posting my opinions I do in fact reveal how dumb I am. I mean, in terms of size-to-wrongness ratio, I've got to be doing pretty well :P. I think commenting under your real name does have real costs; people message me and go "no that was dumb" about twice a week across various media.

Probably if I wanted better job prospects I would hide this better. I have literally been rejected from jobs before for saying "I'm concerned that I might do a bad job here" and them being like "we didn't feel like you wanted it". But I guess I take honesty (I was genuinely uncertain) over "get the best jobs you can". You're welcome.

For most people, the benefit that accrues to you from signing your real name to a controversial post seems pretty minimal. Using one's name creates some increase in author credibility, and thus effectiveness -- although less so on certain types of posts, and where the author doesn't have much of a reputation either way. Otherwise, there seems to be little incentive to do it if you think your post may be unfavorably received by a significant number of people. So even if you assign only a small probability to "postings of the sort I am making will have adverse career effects for me," signing your post is likely to be EV-negative for you.

(There's also the Google effect, although that can be solved with the use of a consistent pseudonym that is not publicly linked to one's name.)

Buck
1y
20
7
5

I disagree; I think that making controversial posts under your real name can improve your reputation in the EA community in ways that help your ability to do good. For example, I think I've personally benefited a lot from saying things that were controversial under my real name over the years (including before I worked at EA orgs).

3
RyanCarey
1y
Yes, but you've usually been arguing in favour of (or at least widening the Overton window around) elite EA views vs the views of the EA masses, have been very close to EA leadership, and are super disagreeable - you are unrepresentative on many relevant axes.
9
Jan_Kulveit
1y
In my view this is an example of a mistake in bounded/local consequentialism. From a deontic perspective, there is a coordination problem, where "at least consistent handle" posts can be somewhat costly for the poster, but an atmosphere of earnest discussion among real people has large social benefits. Conversely, discussion with a large fraction of anonymous accounts - in particular if they are sniping at real people and each other - decreases trust, and is vulnerable to manipulation by sock puppets and nefarious players.

Also, I think there are some virtue ethics costs associated with anonymous posts, roughly in the direction of transparency and integrity. For example, if I imagine myself anonymously posting something critical that was received unfavourably by someone, and later meeting that someone in person, or collaborating on something relevant, I would find it integrity-decreasing to continue hiding the authorship. And if I'd be happy to reveal my identity to the people upset ... why not reveal it directly?

While I don't think these considerations add up to "never post anonymously", I think they are pretty large, and usually much larger than e.g. a "small probability of adverse career effects in the EA ecosystem".

I think posting under pseudonyms makes sense for EAs who are young[1], who are unsure what they want to do with their lives, and/or who have a decent chance of wanting jobs that require discretion in the future, e.g. jobs in politics or government.

I know at least some governance people who likely regret being openly tied to the Future Fund and adjacent entities after the recent debacle.

Also in general I'm confused about how the tides of what's "permissible internet dirt to dig up on people" will change in the future. Things might get either better or much worse, and in the worse worlds there's some option value left in making sure our movement doesn't unintentionally taint the futures of some extremely smart, well-meaning, and agentic young people.

That said, I personally prefer pseudonymous account names with a continued history like Larks or AppliedDivinityStudies[2], rather than anonymous accounts that draw attention to their anonymity, like whistleblower1984. 

  1. ^

     <22? Maybe <25, I'm not sure. One important factor to keep track of is how likely you are to dramatically change your mind in politically relevant ways, e.g. I think if you're currently a

... (read more)

I'm not particularly young anymore, and work in a non-EA field where reputation is a concern, which is a large part of why I post pseudonymously. I think it would be bad if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs". 

2
Arepo
1y
I use a semi-pseudonymous account for nominal privacy, but it isn't very secure, both because in practice various well-meaning people (inc me at one point) have mentioned my name alongside it without thinking about it much - which is surely inevitable for anyone who keeps a pseudonym going for long enough - and because anyone with access to the forum's database could get the email associated with it.
4
Jeff Kaufman
1y
That at least seems easy to fix: register your pseudonym with a new email address that you configure to automatically forward to your main one.

Quinn -- I agree that over-use of anonymous & burner accounts is becoming a significant problem in EA Forum (and in social media generally).

To put this in the broader  context of cancel culture: there seems to be a common Gen Z/Millennial narrative that says: 'Older established professionals actively seek any possible reason to penalize, ostracize, and harm any younger people who speak up, complain, or criticize any aspect of current practices, systems, and ideas; these older professionals are ruthless, biased, unforgiving, and eager to harm our reputations and careers; they pay enormous attention to everything we say, and they never forget or forgive any criticism; therefore, the only possible response is for us to make our complaints and criticisms from behind the veil of anonymous burner accounts'.

I think that's usually a false, harmful, and self-defeating narrative, and it seems especially inaccurate applied to EA culture. But it's a very useful narrative, because it empowers people to engage in cancel culture, anonymously and self-righteously, hiding behind the moral armor of 'I'm so brave to speak up at all, and look, it's so dangerous to do so that I had to use an a... (read more)

A few relevant thoughts from a Facebook thread of mine yesterday:

Man, I feel *super* creeped out by how many posts and comments I'm seeing on the EA forum from anonymous this, burneraccount that, groupofnamelessconcernedcitizens, etc.

Seems like a sign of *something* real bad, even though I haven't quite put my finger on what, and doesn't seem to me like "the light as people come out of the tunnel" in the sense that it doesn't feel to me like a precursor to something good, or people finally overcoming coordination challenges in a bad environment, or whateve

... (read more)

Anonymous accounts created within the very recent past that I found just from skimming the posts and comments from the past few days for, like, five minutes:

BurnerAcct

OutsideView

whistleblower9

anonymousEA20

Sick_of_this

AnonymousAccount

temp_

Burner1989

AnonymousQualy

ConcernedEAs

If you have time to have a look at my post and recent comments, would you say that this account creeps you out, or only the more EA-critical ones?

The alternative is not really to post these things under my real name, but not to post at all (for various reasons: I don't want the pro-EA posts to be seen as virtue signaling, don't want to be canceled in 26 years for whatever will be cancelable then, and don't want my friends to get secondhand reputation damage).

Pro-EA posts made anonymously creep me out 98% as much; I personally would rather (most of) these posts not happen at all than happen anonymously. See above for my caveat to that general position.

5
projectionconfusion
1y
I have a job outside EA where reputation is a concern, so as is normal for people in such industries I post mostly anonymously online, and start new accounts periodically to prevent potential information leakage. If the only way to engage with EA discussion online were under my real name, I wouldn't do so. That's probably on the extreme end, but I think lots of people exist somewhere on this spectrum, and it would probably be bad for the movement if discussions were limited to only people willing to post under their real names, or persistent identities, as that would exacerbate problems of insularity and groupthink.
7
Jeff Kaufman
1y
Another example: here's a comment created with a new anonymous account, I think so they could use the "username" field as a subject line for their comment?
-1
Ivy Mazzola
1y
Wow, that's really hard-defecting on forum norms.

I think, trying to introspect on the creepies, that a big part of the problem is something like:

"Here I am, in substantial disagreement/criticism of a subculture, but I don't want my large and fairly crucial disagreement attached to my name because if people understood my *true* beliefs they might not want to hire me at their org."

Hm, I hadn't thought about it in those terms, but I guess that is a little weird.

I tend to like people being able to weigh in on stuff anonymously, as long as it doesn't dominate the discussion. (If there are tons of anons in a discussion, I start to worry more about sockpuppetry.)

And, e.g., if you're writing a sweeping critique of medical culture while trying to start a career in medicine, it makes sense that you might want to post pseudonymously because of the potential career repercussions.

But I guess it's a little odd to write a sweeping critique of EA culture, and hide your identity in the hope of working at an EA org? Getting an EA job is an altruistic goal, where the quality of the mission and strategy presumably matters a great deal for where you want to work.

If EA is unresponsive to your awesome critique, then if the critique is important enough,... (read more)

I don't think people are thinking of it as 'muhaha, now I can infiltrate EA orgs that don't know my real views on anything'; I'm guessing it's more that some people (a) haven't thought through the upside of loudly signaling their views (to spark useful debates, and find like-minded EAs to team up with), or (b) haven't thought through the downside of working at an EA org where you have to keep your real views about lots of important EA things a secret.

Thinking about it more, I could imagine a thing here like 'EAs wanting to put their best foot forward'. Maybe you endorse a post you wrote, but you don't think it reflects your very best work, so you're wary of it being the first thing people see from you. Whereas once you've already proven yourself to someone, you might feel more comfortable sharing your full thinking with them; there's plenty of middle ground between "wanting critique X to be the first thing everyone hears about you" and "wanting to hide critique X from your co-workers".

Or they might indeed think that their critique isn't that important, such that it's not the specific hill they want to die on; if their post doesn't get a really positive reception on their first attempt, it may be something they'd rather drop than keep fighting for, while also being too minor for them to want to use it as a filter for which projects they'd like to work on.

2
Jeff Kaufman
6mo
Can you say more? That doesn't seem to be a situation with burner accounts, but maybe you're saying that Kurt trying to be anonymous would have gone more or less well?
2
quinn
6mo
I'm inferring that the kind of reasons Kurt had for not talking about this as much (in, say, earshot of me) before now are the exact same kind of reasons people are overall intimidated / feel it's too scary not to use burner accounts for things.
2
Jeff Kaufman
6mo
Thanks! Though I think it's common that people think burner accounts give them more protection than they actually do, unless combined with something like running them through an LLM for restyling. More: Linking Alt Accounts.
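A minimal sketch of the kind of stylometric linking that post gestures at, assuming off-the-shelf scikit-learn and toy comment histories (the account texts, feature settings, and the account names are illustrative assumptions, not anything from the thread or the linked post):

    # Toy illustration (assumed setup): compare writing style across accounts
    # using character n-gram TF-IDF vectors and cosine similarity.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical comment histories, one concatenated string per account.
    named_account = "I basically never look at the username to decide whether to read a comment."
    burner_account = "I never really look at usernames when deciding whether a comment is worth reading."
    unrelated_account = "Bednets remain among the most cost-effective global health interventions."

    vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
    vectors = vectorizer.fit_transform([named_account, burner_account, unrelated_account])

    # Higher similarity between the first pair (weakly, on this toy data) is the
    # kind of signal that can link a burner to its author; restyling the text,
    # e.g. via an LLM, is what degrades this signal.
    print(cosine_similarity(vectors[0], vectors[1]))  # named vs. burner
    print(cosine_similarity(vectors[0], vectors[2]))  # named vs. unrelated

On real comment histories you'd want far more text per account and a proper baseline before reading anything into the numbers; this only shows the shape of the technique, not a claim about how well it works in practice.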

Speaking only for ConcernedEAs, we are likely to remain anonymous until costly signals are sent that making deep critiques in public will not damage one's career/funding/social prospects within EA.

We go into more detail in Doing EA Better, most notably here:

Prominent funders have said that they value moderation and pluralism, and thus people (like the writers of this post) should feel comfortable sharing their real views when they apply for funding, no matter how critical they are of orthodoxy.

This is admirable, and we are sure that they are be

... (read more)
4
Robi Rahman
1y
I'm willing to make a very costly signal to help you test your theory. Would both outcomes of the following experiment update your opinions in opposite directions?

Here is my idea: I'll open up a grant form from EV, OP, LTFF, etc., and write a grant proposal for a project I actually want to execute. Instead of submitting it right away, I'll post my final draft on the forum, and ask several grantmakers to read it and say how much funding they'd be willing to give me. Then I'll make another post, containing very harsh criticism of EA or prominent EA organizations or leadership, such as the aforementioned funding orgs or their top executives. (You can suggest criticisms you want me to post; otherwise I'll aim to write up the most offensive-to-EA-orthodoxy thing I actually believe or can pass an ITT about.) Finally, I submit the grant proposal to the org I just criticized and see how much money I get.

If this gets as much funding as the reviewers estimated, this is at least weak evidence that public criticism of EA doesn't hurt your prospects. If I get less funding, I'll admit that public criticism of EA can damage the person making the criticism. Agreed?
5
Jeff Kaufman
1y
Aside from the bit where you publish something you don't believe and wouldn't otherwise write, and how estimates ahead of time might not be that predictive of how funding actually goes, talking about how you plan to do this on the Forum means this probably doesn't work at all. Someone at the grant making organization sees your harshly critical post, thinks "that's a really surprising thing to come from Robi", someone else points out that you're doing it as an experiment and links your comment, ...
4
Robi Rahman
1y
Crap. I guess I should've posted the above comment from a burner account... But anyway, serious reply: I thought of all of those problems already, and have several solutions for them. (For example, have someone who isn't known to the grantmakers to be connected to me do the experiment instead of me.) ConcernedEAs, would you accept this experiment if I propose a satisfactory variation, or in principle if it's not practically workable?
-4
quinn
1y
epistemic and emotional status: had a brief look at your post and some comments, got the impression that 400 comments didn't move the needle of your mind at all, which disappointed me.

I don't understand why you think you'd like to be a part of EA; the list of orthodoxies just seems like the movement's stated premises and goals (and yeah, it'd be a problem if AMF was firing people for not being transhumanist, but I'd roll to disbelieve that something like that is actually happening). So what I'd suggest, at about 65% confidence, is a kind of broad "your reasons for anonymity are deep down reasons to try making a living in another philanthropic ecosystem", or a harsher heuristic I'm only about 25% confident in, which is "the urge to be anonymous is a signal you don't belong here".

I'm glad the decentralization discourse is in the Overton window (I have a few sketches and unfinished projects in the parallel/distributed epistemics space, and I intermittently study a mechanism design textbook so I can take a crack at contributions in the space), but I haven't seen a good contribution from the pro-decentralization side that came from the folks who are talking about fearing retribution for their bold views.
Jason
1y
29
11
0

If ConcernedEAs posted with their real names, would you be less likely to hire them for an EA role? Even if not, would you agree that ConcernedEAs might reasonably draw that conclusion from your comment suggesting they might not belong here?

4
quinn
1y
Sorry for any terseness, and this may get out of scope or be better placed in their original post's comments. Keep in mind I'm not someone whose opinion matters about this.

Plausibly, but who knows. Inclusivity failures are not an indictment. Sometimes knowing you disagree with an institution is a prediction that working together wouldn't go super well.

As a baseline, recall that in normie capitalism hiring discrimination on alignment happens all the time. You have to at least pretend to care. Small orgs have higher standards of this "pretending to care" than large orgs (see Gwern's Fermi estimate of the proportion of Amazon employees who "actually" care about same-day delivery). Some would even say that to pull off working at a small org you have to actually care, and some would say that most EA orgs have more in common with a startup than an enterprise. But ConcernedEAs do care. They care enough to write a big post. So it's looking good for them so far.

I probably converge with them on the idea that ideological purity and the accurate recitation of shibboleths is a very bad screening tool for any org. The more movement-wide cohesion we have, the greater a threat this is. So like, individual orgs should use their judgment and standards analogous to a startup avoiding hiring someone who openly doesn't care about the customers or product. That doesn't mean a deep amount of conformity.

So with the caveat that it depends a lot on the metric ton of variables that go into whether someone seems like a viable employee, in addition to their domain/object-level contributions/potential/expertise, and with deep and immense emphasis on all the reasons a hiring decision might not go through, I don't think they're disqualified from most projects. The point is that due to the nitty gritty, there may be some projects they're disqualified from, and this is good and efficient. Rather, it would be good and efficient if they weren't anonymous.

Aside: I also recommend not immediately disabling my account when I comment once, as this creates weird consequent dynamics that are difficult to control.

Obviously, when someone keeps making dummy accounts over and over again to circumvent forum moderation, they should be disabled immediately. (Also, you should stop doing that.)

I've been trying this out for a while now, since before the last few days' posts about why there are burner accounts on the forum: I've been posting lots of stuff most effective altruists may not be willing to post. I cover some of the how and why of that here.

https://forum.effectivealtruism.org/posts/KfwFDkfQFQ4kAurwH/evan_gaensbauer-s-shortform?commentId=EjhBquXGiFEppdecY

I've been meaning to ask if there is anything anyone else thinks I should do with that, so I'm open to suggestions.

[comment deleted]
1y
2
0
0