Meta: I'm glad you wrote this. Finding new cause areas seems like a really important activity for EAs to be engaging in.

Object: It's not at all clear to me that consciousness, whatever that means, is a precondition for value.

I think you're doing the thing shlevy described about being way too charitable to Gleb here. Outside view, the simplest hypothesis that explains essentially everything observed in the original post is that Gleb is an aggressive self-promoter who takes advantage of EA conversational norms to milk the EA community for money and attention.

It might be useful to reflect a little on what being manipulated feels like from the inside. An analogous dynamic in a relationship might be Alice trying very hard to understand why Bob sometimes behaves in ways that make her uncomfortable, hypothesizing that maybe it's because Bob had a difficult childhood and finds it hard to get close to people... all the while ignoring that outside view, the simplest hypothesis that explains all of Bob's behavior is that he is manipulating her into giving him sex and affection. It's in some sense admirable for Alice to try to be charitable about Bob's behavior, but at some point 1) Alice is incentivizing terrible behavior on Bob's part and 2) the personal cost to Alice of putting up with Bob's shit is terrible and she shouldn't have to pay it.

I may be misinterpreting you here; you wrote

To re-iterate, it's delightful to be part of a community that responds to this sort of situation by spending ~100s of hours (collectively) and ~100k words (I'm counting the original Facebook thread as well as the post here) analysing the situation and producing a considered, charitable yet damning report.

and while I think this behavior is in some sense admirable, I don't think it is delightful: the huge waste of time it represents is bad on net, except to the extent that it leads to better community norms around policing bad actors.

I literally still don't understand. I can understand the motivation to be an asshole in communities you think won't treat you fairly, but why be a lying asshole? I think the OP wrote "honesty" and meant something else.

I don't think this comparison holds water. Briefly, I think SI/MIRI would have mostly attracted criticism for being weird in various ways. As far as I can tell, Gleb is not acting weird; he is acting normal in the sense that he's making normal moves in a game (called Promote-Your-Organization-At-All-Costs) that other people in the community don't want him playing, especially not in a way that implicates other EA orgs by association.

Whatever you think of that object-level point, an independent meta-level point: it's also possible that the EA movement excluding SI/MIRI at some point would have been a reasonable move in expectation. Any policy for deciding who to kick out necessarily runs the risk of both false positives and false negatives, and pointing out that a particular policy would have caused some false positive or false negative in the past is not a strong argument against it in isolation.

I'd be much more inclined to act with honesty if I believed people would do an extremely thorough public investigation into everything I'd said, rather than just calling me names and walking away.

I don't understand what you're claiming here. Are you saying you'd be honest in a community if you thought it would investigate you a lot to determine your honesty, but dishonest otherwise? Why not just be honest in all communities, and leave the ones you don't like?

Witch hunting and attacks do nothing for anyone.

Attacking people who are bad protects other people in the community from having their time wasted or being hurt in other ways by bad people. Try putting yourself in the shoes of the sort of people who engage in witch hunts because they're genuinely afraid of witches, who if they existed would be capable of and willing to do great harm.

To be clear, it's admirable to want to avoid witch hunts against people who aren't witches and won't actually harm anyone. But sometimes there really are witches, and hunting them is less bad than not.

People can look at clear and concise summaries like the one above and come to their own conclusion. They don't need to be told what to believe and they don't need to be led into a groupthink.

This approach doesn't scale. Suppose the EA community eventually identifies 100 people at least as bad as Gleb in it, and so generates 100 separate posts like this (costing, what, 10k hours collectively?) that others have to read and come to their own conclusions about before they know who the bad actors in the EA community are. That's a lot to ask of every person who wants to join the EA community, not to mention everyone who's already in it, and the alternative is that newcomers don't know who not to trust.

The simplest approach that scales (both with the size of the community and with the size of the pool of bad actors in it) is to kick out the worst actors, so that nobody has to spend additional time or effort figuring out how bad they are.

Pretty much agree with you and shlevy here, except that the wasting hundreds of collective hours carefully checking that Gleb is acting in bad faith seems more like a waste to me.

If the EA community were primarily a community that functioned in person, it would be easier and more natural to deal with bad actors like Gleb; people could privately (in small conversations, then bigger ones, none of which involve Gleb) discuss and come to a consensus about his badness, that consensus could spread in other private smallish then bigger conversations none of which involve Gleb, and people could either ignore Gleb until he goes away, or just not invite him to stuff, or explicitly kick him out in some way.

But in a community that primarily functions online, where by default conversations are public and involve everyone, including Gleb, the above dynamic is a lot harder to sustain, and instead the default approach to ostracism is public ostracism, which people interested in charitable conversational norms understandably want to avoid. But just not having ostracism at all isn't a workable alternative; sometimes bad actors creep into your community and you need an immune system capable of rejecting them. In many online communities this takes the form of a process for banning people; I don't know how workable this would be for the EA community, since my impression is that it's spread out across several platforms.
