Lumpyproletariat

739 · Joined Jul 2020

Bio

Lumpy is an undergraduate at some state college somewhere in the States. He isn't an interesting person and interesting things seldom happen to him.

Among his skills are such diverse elements as linguistic tomfoolery, procrastination, being terrible with computers yet running Linux anyway, a genial temperament and magnanimous spirit, a fairly swell necktie if he does say so himself, mounting dread, and quiet desperation.

Plays as a wizard in any tabletop or video game where that's an option, regardless of whether it's a *strong* option. Has never failed a Hogwarts sorting test, of any sort or on any platform. (If you were about to say how one can't fail a sorting test . . . one surmises that you didn't make Ravenclaw.) Read The Fellowship, Two Towers, and Return of the King over the course of three sleepless days at age seven; couldn't keep down solid food after, because he'd forgotten to eat. Was really into the MBTI as a tweenager; thought it ridiculous how people said that no personality type was "better" than the others when ENTJ is clearly the most powerful. (Scored INFP, his self, but hey, one out of four isn't so bad. (However, found a better fit in INTP.)) Out of the Disney princesses Lumpy is Mulan--that is, if one is willing to trust BuzzFeed. Which, alas, one is not.

No, but seriously.

Mulan?? 0_o

If, despite this exhaustive list of traits and deeds, your burning question is left unanswered, send a missive in private. Should your quest be noble and intentions pure, it is said that Lumpyproletariat might respond in kind.

Comments (99)

I strong-upvoted your comment because I disagreed that it should be at negative forum karma.

Provocation can shock people out of their normal way of seeing the world into looking at some fact in a different light. This seems to be roughly what Bostrom was saying in the first paragraph of his 1996 email. However, in the case of that email, it's unclear what socially valuable fact he was trying to shock people into seeing in a new way.


Bostrom's email was in response to someone who made the point you do here about provocation sometimes making people view things in a new light. The person who Bostrom was responding to advocated saying things in a blunt and shocking manner as a general strategy for communication. Bostrom was saying to them that sometimes, saying things in a blunt and shocking manner does nothing but rile people up.

Here are the last four things I remember seeing linked as supporting evidence in casual conversation on the EA forum, in no particular order:

https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=HebnLpj2pqyctd72F - link to Scott Alexander, "We have to stop it with the pointless infighting or it's all we will end up doing," is 'do x'-y if anything is. (It also sounds like a perfectly reasonable thing to say and a perfectly reasonable way to say it.)

https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=SCfBodrdQYZBA6RBy - separate links to Scott Alexander and Eliezer Yudkowsky, neither of which seem very 'do x'-y to me.

https://forum.effectivealtruism.org/posts/irhgjSgvocfrwnzRz/?commentId=NF9YQfrDGPcH6wYCb - link to Scott Alexander, seems somewhat though not extremely 'do x'-y to me. Also seems like a perfectly reasonable thing to say and I stand by saying it. 

https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=x5zqnevWR8MQHqqvd - link to Duncan Sabien, "I care about the lives we can save if we don't rush to conclusions, rush to anger, if we can give each other the benefit of the doubt for five freaking minutes and consider whether it'd make any sense whatsoever for the accusation du jour to be what it looks like," seems pretty darn 'do x'-y. I don't necessarily stand behind how strongly I came on there; I was in a pretty foul mood.

I think that mostly, this is just how people talk.

I am not making the stronger claim that there are zero people who hero-worship Eliezer Yudkowsky. 

Ah, I hadn't meant to use "vetting stage" as a term of art.

Gains from trade, and agglomeration effects, and economies of scale. Being effective is useful for doing good; having a lot of close friends and allies is useful for being effective.

I think it's pretty obvious at this point that Tegmark and FLI were seriously wronged, but I barely care about any wrong done to them and am largely uninterested in the question of whether it was wildly disproportionate or merely sickeningly disproportionate.

I care about the consequences of what we've done to them.

I care about how, in order to protect themselves from this community, the FLI is

working hard to continue improving the structure and process of our grantmaking processes, including more internal and (in appropriate cases) external review.  For starters, for organizations not already well-known to FLI or clearly unexceptionable (e.g. major universities), we will request and evaluate more information about the organization, its personnel, and its history before moving on to additional stages.

I care about how everyone who watched this happen will also realize the need to protect themselves from us by shuffling along and taking their own pulses. I care about the new but promising EAs who no one will take a chance on, the moonshots that won't be funded even though they'd save lives in expectation, the good ideas with "bad optics" that won't be acted on because of fear of backdraft on this forum. I care about the lives we can save if we don't rush to conclusions, rush to anger, if we can give each other the benefit of the doubt for five freaking minutes and consider whether it'd make any sense whatsoever for the accusation du jour to be what it looks like.

I barely give a gosh-guldarn about FLI or Tegmark outside of their (now reduced) capacity to reduce existential risk.


Obviously I'd rather bad things not happen to people and not happen to good people in particular, but I don't specifically know anyone from FLI and they are a feather on the scales next to the full set of strangers who I care about.

Eliezer is an incredible case of hero-worship - it's become the norm to just link to jargon he created as though it's enough to settle an argument.

I think that you misunderstand why people link to things.

If someone didn't get why I feel morally obligated to help people who live in distant countries, I would likely link them to Singer's drowning child thought experiment. Either during my explanation of how I feel, or in lieu of one if I were busy. 

This is not because I hero-worship Singer. This is not because I think his posts are scripture. This is because I broadly agree with the specific thing he said which I am linking, and he put it well, and he put it first, and there isn't a lot of point of duplicating that effort. If after reading you disagree, that's fine, I can be convinced. The argument can continue as long as it doesn't continue for reasons that are soundly refuted in the thing I just linked.

I link people to things pretty frequently in casual conversation. A lot of the time, I link them to something posted to the EA Forum or LessWrong. A lot of the time, it's something written by Eliezer Yudkowsky. This isn't because I hero-worship him, or because I think linking to something he said settles an argument - it's because I broadly agree with the specific thing I'm linking and don't see the point of duplicating effort. If after reading you disagree, that's fine, I can be convinced. The argument can continue as long as it doesn't continue for reasons that are soundly refuted in the thing I just linked.

There are a ton of people who I'd like to link to as frequently as I do Eliezer. But Eliezer wrote in short, easily digested essays, on the internet instead of as chapters in a paper book or PDF. He's easy to link to, so he gets linked.

It did not make it past the vetting stage. 

They did not award the grant.

There's an angry top-level post about evaporative cooling of group beliefs in EA that I haven't written yet, and won't until it would no longer be an angry one. That might mean that the best moment has passed, which will make me sad for not being strong enough to have competently written it earlier. You could describe this as my having been chilled out of the discourse, but I would instead describe it as my politely waiting until I am able and ready to explain my concerns in a collected and rational manner.

I am doing this because I care about carefully articulating what I'm worried about, because I think it's important that I communicate it clearly. I don't want to cause people to feel ambushed and embattled; I don't want to draw battle lines between me and the people who agree with me on 99% of everything. I don't want to engender offense that could fester into real and lasting animosity, in the very same people who if approached collaboratively would pull with me to solve our mutual problem out of mutual respect and love for the people who do good.

I don't want to contribute to the internal divisions growing in EA. To the extent that it is happening, we should all prefer to nip the involution in the bud - if one has ever been on team Everyone Who Logically Tries To Do The Most Good, there's nowhere to go but down.

I think that if I wrote an angry top-level post, it would deserve to be downvoted into oblivion, though I'm not sure it would be.

I think on the margin I'm fine with posts that will start fights being chilled. Angry infighting and polarization are poisonous to what we're trying to do.
