I hate to add to the number of FTX posts on the forum, but after some (imo) inappropriate and unkind memes[1] and comments in the Dank EA Memes fb group and elsewhere, I wanted to push back against what seems like a bandwagon of anger and ridicule spiralling too far, and I wish to call attention to it.

But first, I should point out that I personally, at this time, know not nearly enough to make confident conclusions regarding what's happened at FTX. That means I will not make any morally relevant judgments. I will especially not insinuate them without sufficient evidence. That just amounts to irresponsibly fuelling the bandwagon while maintaining plausible deniability, which is arguably worse.

You are not required to pretend to know more than you do just so you can empathise with the outrage of your friends. That shouldn't be how friendship works.

This topic is not without nuance. There's a good case to be made for why ridicule can be pro-social, and I think Alex makes it here:

"Ridicule makes clear our commitment to punishing ultimately harmful behavior, in a tit-for-tat sense; we are not the government so we cannot lock up wrongdoers, and acting as a vigilante assassin is precluded by other issues, so our top utility-realizing option is to meme harmful behavior out of the sphere of social acceptability."[2]

I don't disagree with condemning someone for having behaved unethically. It's a necessary part of maintaining civil society, and it enables people to cooperate and trade in good faith. But if you accuse someone of having ill-advisedly forsaken ethics in the (putative) service of the greater good, then retaliating by forsaking compassion in the service of unchecked mockery can't possibly make anything better.

Why bother with compassion, you might ask? After all, compassion is superfluous for positive-sum cooperation. What we really need for essential social institutions to work at all is widespread trust in the basic ethics of people we trade with. So when a public figure gets caught depreciating that trust, it's imperative that we send a strong signal that this is completely unacceptable.

This I all agree with. Judicious punishments are essential for safeguarding prevailing social institutions. Plain fact. But what if prevailing social institutions are unjust? When we jump on a bandwagon for humiliating the accused transgressor after their life has already fallen apart, we are exercising our instincts for mob justice, and we are indirectly strengthening the norm for coercing deviants more generally.

Advocating punitive attitudes trades off against advocating for compassion to some extent. Especially if the way you're trying to advocate for punishments is by means of gleefully inflicting harm.

In a society where most people are all too eager to join in when they see their in-group massing against deviants, and where groups have wildly different opinions on who the deviants are in the first place, we need an alternative set of principles.

Compassion is a corrective on unjust social norms. It lets us see more clearly where prevailing ethics strays from what's kind and good. In essence, that's the whole purpose of effective altruism: to do better than the unquestioned norms that have been handed down to us.

That's why I hope we can outgrow--or at least lend nuance to--our reflexive instinct to punish, and instead cultivate whatever embers of compassion we can find. Let that be our cultural contribution, because the alternative, advocating punitive sentiments, just isn't a neglected cause area.



Disagree with this, and I think the EA community would be better off feeling their own feelings sometimes. Anger can be a very destructive emotion, and it's a healthy impulse to be wary of one's own anger. But as a community, EAs are suspicious of their own desires to a fault. Of course we're angry at Sam - he is destroying something we care about. 

People would be a lot happier if they stopped trying to rationalize themselves out of their own preferences. Not to mention a lot more convincing and personable to normies.

Welcome to the forum! I agree that EAs often have a really troubling relationship with their own feelings, and scruples to a fault. If you have strong reason to believe that Sam acted unethically, I have no objections against directing your feelings of anger at him. But I would urge people to carry their anger with dignity, both for the sake of community norms and their own sense of self-worth.

Hey, thanks for the warm welcome 👋. Not new to EA, but have never really posted on the forum much.

At this point, I'm pretty sure that FTX lent Alameda Research user funds in violation of its own terms of service (https://www.wsj.com/articles/ftx-tapped-into-customer-accounts-to-fund-risky-bets-setting-up-its-downfall-11668093732). 

Rather than carry anger with dignity, I think a lot of EAs try not to have any anger at all. Yeah, if I could transform myself into the Buddha and was able to replace anger with pure compassion, I would. But I'm not really able to do this, and if I'm going to feel angry about something, I think it's healthier for me to just feel it than to reason myself into not feeling it.

Scrolling through the SBF debacle memes of the broader world, I’m kinda gut-punched to see all the images of Sam and Caroline getting raped. I think it’s important that we not give in to similar dynamics of dehumanizing or group fantasizing about violent punishments, and I think that starts with advice like this, Emrik.

...what? I haven't been outside for a spell, and I knew things were bad, but that's just broken.

(We can fix it, though! With faith, patience, and a whole lot of sitting in office chairs!)

I’m impressed by your emotional gamut that you can feel anger and compassion simultaneously! (I don’t tend to feel anger.)

Stuff I feel when I see such posts (DEAM is classy in comparison to Twitter):

  1. Compassion, because I still think they were basically well-intentioned, though evidently they made all the morally and strategically wrong calls.
  2. It feels like uncivilized vigilantism that makes me feel unsafe.
  3. They’re not ugly at all imo, so if I or anyone else is further from the beauty ideal, does that make us all even uglier (unbeknownst to me)? I can only imagine how terrible people must feel who read such posts and actually look similar to them!
  4. It feels arbitrary and unpredictable. Why is someone suddenly ugly when they steal a ton of money? Are people going to think I smell bad because I made a bad mistake in my driving test 15 years ago? This probably just triggers childhood trauma where students alleged that I must’ve sustained in-utero brain damage because I’m afraid of violence or something. What keeps people from punishing, e.g., Rob Wiblin because they think that he caused harm by switching from CEA to 80k or whatever (80k used that as an example in some article)? That would seem super mean, uncalled-for, and unproductive!
  5. One of the many ways in which I’m completely different from them is that I’m highly risk-averse. That’s something that is generally frowned upon in our communities. I feel guilty for perpetuating a norm that is merely good at some common margin and not universally and can be so catastrophic when applied universally.

This is the right kind of thinking in general, but in the specific context of this complex situation it would risk causing more problems than it would solve. The situation right now appears relatively stable and it could quickly get much worse. 

The media is combing over the catastrophe and looking for villains, because a good villain gets clicks like nothing else. There is already overwhelming evidence that weird movements like EA make for lucrative foils/characters in these sorts of stories, because ideologies orthogonal to the mainstream left and right are easy and satisfying targets for random people to hate on and click on.

On the other hand, if the dank EA memes page is amplifying messages in an unusually bizarre, intense, and aggressive way, then that's very concerning from a social engineering standpoint, since the hosting website is a repeat offender and has vested interests in the AI industry.

Hi Trevor, I am really interested in the link about "it could quickly get much worse", but you seem to have pasted the wrong thing there.

I'm not sure I agree with this. I agree that compassion is a good default, but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis, which will include many people in the 'Dank EA Memes' Facebook group. Humour can be a coping mechanism which will make some people feel better about bad situations:

"As predicted, individuals with a high sense of humor cognitively appraised less stress in the previous month than individuals with a low sense of humor and reported less current anxiety despite experiencing a similar number of everyday problems in the previous two months as those with a low sense of humor. These results support the view that humor positively affects the appraisal of stressful events and attenuates the negative affective response, and related to humor producing a cognitive affective shift and reduction in physiological arousal (Kuiper et al. 1993; Kuiper et al. 1995; Martin et al. 1993).

Maybe there is a way to use humour in a way that feels kinder, but I've personally yet to see anything since the FTX crisis started that could be defined as "compassionate" but also that made me laugh as much as those memes did.

While I agree that humour is a great de-stressor, I have faith in our ability to find alternative ways to entertain ourselves that don't involve kicking someone while they're down.

but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis

I agree with this. But I think compassion needs to be extended even further, to every sentient being who might be worse off because of this event (this assumes that EA suffering from this event will mean less good done in the world, and I recognize that there are people who seem to genuinely think that the world will be better if EA disappears). And the implication of this is that we need to think about whether these mockeries and this passionate outrage are good for these sentient beings.

It might be the case that some EAs who are doing mockery or passionate outrage are doing it as a way of damage control. But from a longer-term perspective, I am not sure these mechanisms are net-good, for the reason below.

On a more general level, it seems to me that trusting and following our social norms systematically and reliably leaves out most sentient beings who deserve our compassion (future people, nonhuman animals in general, potential digital beings). And anger/disgust as mechanisms for "enforcing ethics" seem to me particularly dubious, if not harmful, as they are often also directed at those who don't show anger/disgust toward what most people think deserves it, thereby reinforcing whatever norms are already widely held rather than extending them. Also, people can observe which things attract anger/disgust and which don't, and I believe that observation will inevitably lead some, if not many, people to use it as evidence for how bad/important/urgent an issue is.

On a personal level, I have tried to move away from using anger or disgust to regulate my moral thinking and my actions, or as mechanisms to change the world, and I seem to have had some success. I used to be extremely angry with people who know the suffering of factory-farmed animals but still choose to keep fueling it, and moderately disgusted with farmed animal advocates who somehow think the suffering of animals in nature is okay. But I no longer feel these emotions as strongly as I used to. And I have to admit, I don't feel much emotional anger or disgust this time even though I think something very wrong likely has happened.


UPDATE: I saw Wixela's comment above after finishing typing this. I agree with Wixela that EAs are sometimes better off feeling what we genuinely feel, especially given that EAs already have pretty widespread and strong norms on controlling emotions and letting rationality fix our instincts/emotions. But I stand by the view that anger/disgust as mechanisms of "enforcing ethics" are pretty dubious.

I strong disendorse this. Your post and comments about this make me angry. Stop policing my emotions. I know more than enough to make a judgement of approximately what happened and that it deserves my judgement for fucking me and many others over.

Actually appreciate this comment. I should've been more clear about when I was using universal vs existential quantifiers and anything in between. I do not advocate that everyone should withhold anger, because perhaps (as is likely) some people do in fact know much more than me, and they know enough that anger is justified.


I'm struggling to understand your post. You say you don't know enough about FTX to cast moral judgments. Okay, fine. But then you seem to take issue with people who do know more than you who are making memes about someone who has committed massive fraud and in all likelihood effectively robbed innocent people of billions of dollars? Are you saying no one should make memes about bad actors, or because you yourself don't feel comfortable calling SBF a bad actor other people shouldn't, or ...?

I don't object to people condemning him if they know more than me. I clarified what I meant in response to David's comment below.

If I could strongvote this even harder I would! People should stop riling themselves up about the community building implications of this until after we actually know what happened and why!

To be fair, you have to have a very high IQ to understand dank EA memes. The humor is extremely subtle, and without a solid grasp of theoretical memetics most of the jokes will go over a typical effective altruist's head. There's also the nihilistic outlook, which is deftly woven into their characterisation - the aesthetic draws heavily from Narodnaya Volya literature, for instance. The fans understand this stuff; they have the intellectual capacity to truly appreciate the depths of these jokes, to realize that they're not just funny- they say something deep about LIFE. As a consequence EAs who don't understand dank memes truly ARE normies- of course they wouldn't appreciate, for instance, the humour in the catchphrase "thank machine doggo," which itself is a cryptic reference to Apinis' original classic. I'm smirking right now just imagining one of those guileless geeks scratching their heads in confusion as genius trolley problems reveal themselves on their smartphone screens. How I pity them. 😂 And yes by the way, I DO have a dank EA memes tattoo. And no, you cannot see it. It's for the ladies' eyes only- And even they have to demonstrate that they're within 5 IQ points of my own (preferably lower) beforehand.

For people that might not know the reference, this is based on the "To Be Fair, You Have To Have a Very High IQ to Understand Rick and Morty" meme.
It's meant to be sarcastic and make fun of the "self-congratulatory way people talk [about the show]"


lol I laughed

[mod of Dank EA Memes here, but not speaking on behalf of the entire team]

I agree with parts of what you say "in theory." The memes you tagged don't seem like the most dank ones in the group, there are certainly better ones. It's unclear to what extent you are trying to cast shade on the moderation rules, mods' own opinions, or the group behavior of likes. Your comment to me in the group feels like a massive nitpick on the word "firmly," when my tagging of SBF was a very basic question. I have also mentioned that we don't condone "vigilante violence" and would start banning posts if there's too much insinuation of that. So we already have lines that we don't want to cross, maybe yours is way more strict.

Given the above, it's unclear what your critique is proposing in a positive sense? We are not going to forgive SBF. We should not ignore the situation without looking at it. Should we make "less memes"? It's a distributed forum, many people are contributing. We are even spreading important information regarding the hack and the need to remove people's apps. Has EA forum done that? No. Why not?

As SE Montgomery mentioned, humor is an important mechanism for coping; however, it's not just "coping" to create "fake feelings" of being better. It is also about getting everyone on the same page about certain philosophical stances and stating that particular bad behaviors are, in fact, bad.

It is very clear to me that EAs have massive blind spots that have led to this debacle. While the responses so far have been good, I worry that people expect them to be enough. 

As I have mentioned, extremely shady FTX-funded non-profits such as https://www.againstpandemics.org are still operating and have not taken the correct step of employees resigning or shutting down.

I personally have a ton of disagreements with EA on the meta-level, and DEAM feels like the only place I could surface them with any reach.

[Wasn't trying to attack the FB group. I'm glad it exists.]

But I was more curious to ask: why do you say Against Pandemics is extremely shady? It's the first time I've heard of them.

I assume nonprofits and independents who were kept afloat by FTX money have to find alternative sources of funding (or otherwise be forced to shut down). But do you mean they should abstain from even doing that?

Thanks for saying you are glad the group exists, it helps me feel better. 

I am not trying to make a general case here. I am talking about "against pandemics" specifically. 

They are run by Sam's brother. This alone paints a target on them for EVERY LAW ENFORCEMENT agency. On top of that, they are involved in politics. EAs being involved in politics is ok, but you've gotta be in tip-top clean shape if you want to do that. I suspect what they are lobbying for might have a ton of problems and be net-negative, but that's a separate discussion and the above points should be sufficient.

It's possible other non-profits also present a heavy reputational or even legal risk, but to find them, someone from EA needs to do a full review of the funding targets. 

We shouldn't try to shut down a nonprofit based on kin associations. This would basically amount to kin punishment, and it would create a dangerous ethical precedent. 

As for funding, probably most nonprofits in the community have received FTX funding directly or indirectly as of right now, and I don't see how that should affect how we evaluate their work.

You seem to be misreading my comment. 

Let's get back to the basics. The non-profit is 100% funded by stolen money. How much money? Who knows? I don't know how many non-profits exist solely because of FTX, but this is an important issue to come clean about. 

I am not asking something difficult, such as "giving back spent salaries". As EY pointed out, if the money was given as a salary, it was given as a salary. However, GOING FORWARD, people should not be funded with stolen funds. 

If a non-profit is funded 50/50 by FTX and OpenPhil, then yeah, having some funding shouldn't affect how we evaluate their work. However, if it's funded 100% by FTX, then we absolutely should assume the non-profit is net negative until someone else picks up the tab.

Implying that "not using stolen funds" is somehow a punishment is 100% proof that you are not at all serious about "fraud is not ok in service of EA."

The brother issue isn't a punishment issue, it's a RISK issue. Any reasonable law enforcement / civil suits looking for co-conspirators will look at this non-profit first and foremost. This isn't even a "risk," it's a near certainty. 

I learned about this org not from EA but from a search about news on other sites with people pointing out how sketchy this is. To a person outside of the "bubble", this looks really BAD and every person defending this looks deeply out of touch with how much hurt FTX has caused.

There's a difference between arguing to stop using stolen funds “GOING FORWARD”, and shutting down an NGO just because it was funded by FTX in the past. The more obvious alternative is merely changing funding sources, which is what most NGOs affected by the FTX situation are already trying to do.

As for the last part, I think you're really exaggerating the risk associated with having received FTX funds. It seems extremely unlikely regulatory agencies will start investigating NGOs just for having received funds from the FTX Future Fund. Even if such risk comes from an employee being SBF's brother (which it shouldn't, because most regulatory agencies aren't in the habit of persecuting kin without good suspicion), there's no point in pre-emptively shutting down an NGO.

This whole line of reasoning feels very strongly like motivated reasoning, and I don't think we should be taking decisions like shutting down NGOs lightly.

My initial guess is $200 million was given to EA, so this should be a first point to update on.

Why do you perceive ridicule to be significantly worse than other expressions of anger or social punishment? It seems you endorse condemnation here, but are strongly against the former. But ridicule is one of the most powerful forms of condemnation.

If you endorse condemnation, why not spend more breath condemning SBF than defending him? (My hypothesis for why not follows.)

Given my beliefs about the scale of his wrongdoing[1], two things follow: firstly, it’s odd to prioritise compassion for Sam Bankman-Fried so strongly over the victims, and secondly this is exactly the moment for us to cooperate on reinforcing the norms he broke. That is the value of punishment: not because it’s intrinsically good, but because of the importance of maintaining the norm. Given what I believe.

Isn’t the reason you have different feelings -- the compassion and desire to protect him, and the lack of anger and lack of desire to condemn[2] -- because you have different beliefs about what he did, rather than because you think social punishment is so often mistaken as to not apply in the case of massive norm breaking? It seems that’s more likely the crux, because I’d be surprised to hear you disavow social punishment across the board. You are uncertain whether he acted wrongly, and so you want to prevent others from jumping too far in their judgments and punishments too quickly.

In which case it’s a mistake to dwell on general and meta-level arguments about why it’s epistemically dangerous to be angry, as opposed to directly addressing the object level claim that we should still be uncertain of the facts.

I could be wrong about the crux. You talk about anger and compassion trading off, so is there some scenario in which you think anger and ridicule are appropriate?

  1. Originally I was going to say ‘defection’ rather than ‘wrongdoing’ to appeal more to your charitable posture towards SBF, but then I realised I’d be part of the problem I’m arguing against, namely, failing to condemn. ↩︎

  2. I recognise I’m interpolating your emotional response from sparse data; feel free to contradict. I am going on private data here too. ↩︎

Mh, crux is wrong. My objections are consistent with my past behaviour in similar situations.

  1. I am not categorically defending Sam from everything. I am conditionally defending him from a subset of things. Though I think his welfare is important, my primary purpose here isn't about that. (I do think his welfare matters, just as anyone should have their core dignity as a sentient being respected, regardless of who they are or what they've done.)
  2. I would write something equivalent to this post regardless of whether I believed Sam had done something unethical,[1] because I think some of the community's response was, in part, unhealthy and dangerous either way.
  3. When it involves outrage, our epistemic rigour and reluctance to defer to mass opinion should be much stricter than baseline. What happened instead was that people inferred Sam's guilt by how confidently outraged their peers were. And because in our present culture it's really hard to believably signal condemnation without also signalling confidence, this is a recipe for explosive information cascades/bandwagons. This is extremely standard psychology, and something we should as a high-trust community be unceasingly wary of. For this reason primarily, we should be very--but not infinitely--reluctant to enforce "failing to condemn" as a crime.
  4. I don't object to people condemning his actions, especially not to the people who are clearly conditionalising their condemnation on seeing specific evidence. I'm not claiming other people don't know more than me, and they might have much stronger reasons to be confident.
  5. Ridicule is more tangential to the harm, and has much more associative overlap with cruelty compared to anger and condemnation. Ridicule doesn't even pretend to be about justice (usually). If ridicule must be used, it works better as a tool for diminishing people in power, when you want them to have less power; when someone is already at the bottom, ridicule is cruelty. (Maybe the phase shift in power was so sudden that people failed to notice that they are now targeting someone who's suddenly very vulnerable to abuse.)
  1. ^ I have my object-level probabilities, but part of my point is the extent to which I'm expected to reveal them, which makes me think I should leave it ambiguous, at least in public. "Which side are you on?!" <- Any social instincts that pressure a resolution to this question should be scrutinised with utmost suspicion.


I don’t want to imply that failure to condemn is itself worthy of condemnation (except once we’re over a threshold of confidence). I do mean to say that trying to defend SBF from the small harm of ridicule by memes is a bad prioritisation of words.

It was a surprising enough decision that it made more sense to me to think you were motivated by your uncertain beliefs about his actions rather than a principled stance against ridicule. But I am willing to believe you have such a principled stance against ridicule. So now I want to argue that you shouldn’t take such a strong stance against ridicule.

If you like, please tell me in what scenarios you think outrage and ridicule are appropriate, if any. That would help to cash out what actual trade off between punishment and compassion you are recommending and I could see how far we are from agreeing.

I should clarify that the harm I envision is not mostly about Sam or others at FTX. It's the harm I imagine indirectly caused to the movement, and by the movement, by condoning insufficiently-informed bandwagons of outrage and pile-on ridicule. It harms our alignment, our epistemic norms, and our social culture; and thereby harms our ability to do good in the world.

Anger, ostracism--heck, even violence--seems less likely to misfire than ridicule. Ridicule is about having fun at another's expense, and that's just an exceedingly dangerous tool even when wielded with good intentions (which I highly doubt has been the primary motivation most people have had for using it).

(Thanks for highlighting these questions.)
