Hello! It’s me, a small-scale part-time EA community builder. I read The Life You Can Save in 2009 and figured that in addition to being a vegan and a social worker, I should donate 10%-plus of my income to highly effective causes. Then I connected with my local effective altruism community in 2016 and figured that I should also spend a not-insignificant portion of my waking hours encouraging and connecting other people who want to make the world a better place.

I am cheerful. I work hard. I volunteer at EAGs. I show up for the people around me.

Why? Because I think it’s the right thing to do.

But folks, I am TIRED. 

I am tired of having a few people put on pedestals because they are very smart - or very good at self-promotion. I am tired of listening to arguments about who can have the think-iest thoughts. I am tired of drama, scandals, and PR. I am tired of being in a position where I have to apologize for sexism, racism, and other toxic ideologies within this movement. I am tired of convening calls with other community builders where we try to figure out how to best react to the latest Thing That Happened. I am tired of billionaires. And I am really, really tired of seeing people publicly defend bad behavior as good epistemics.

I’m just here because I want the world to be a better, kinder, softer place. I know I’m not the only one. I’m not quitting. But I am tired.

Maybe you are tired, too.


 


Thanks for all your hard work, Megan.

I'm reminded of this post from a few months ago: Does Sam make me want to renounce the actions of the EA community? No. Does your reaction? Absolutely.

And this point from a post Peter Wildeford wrote: "I think criticism of EA may be more discouraging than it is intended to be and we don't think about this enough."

In theory, the EA movement isn't about us as EAs. It's about doing good for others. But in practice, we're all humans, and I think it's human nature to have an expectation of recognition/gratitude when we've done an altruistic act. If instead of gratitude, we get a punishment in the form of a bad outcome or sharp words, that feels like a bait & switch.

My hypothesis is that being surrounded by other do-gooders makes the situation worse. You feel like you're in a recognition deficit, many people around you feel the same way, and no one is injecting gratitude into the ecosystem to resolve the misery spiral. Internal debates exacerbate things, insofar as trying to understand someone else's perspective depletes the same emotional resource that altruism does.

Anyway, most of that wasn't very specific to your post -- I'm just wondering if emphasizing "other-care" in addition to "self-care" would help us weather ups & downs.

And, thanks to all the EAs reading this for all the good you are doing.


Hi Megan  - I just wanted to say thank you for writing this. For being you. For showing up. This so well encapsulates how I and so many others are feeling. I just wanted to say thank you <3

(and I'm tired too).

I resonated with this post a lot. Thank you for writing it.

Yeah, I empathise. 

burner

I am very surprised by the warm reception to this post. To my mind, this is exactly the type of rhetoric we should be discouraging on the Forums. It's insinuating all kinds of scandals

(I am tired of drama, scandals, and PR. I am tired of being in a position where I have to apologize for sexism, racism, and other toxic ideologies within this movement)

without making any specific allegations or points, which becomes somehow acceptable within the emotional frame of "I am TIRED." Presumably many other people, including those directly impacted by these things, are tired too, and we need to use reason to adjudicate how we should respond.  

I had a negative reaction to the post but felt hesitant to reply because of the emotional content. It does suck what the OP is experiencing - I think they (and others) could make the EA movement less central to their identity, and that this would be a good thing. I don't like that 'small-scale EA community builders' are having to apologise for things others in EA have done, or having to spend time figuring out how to react to EA drama. That does seem like a waste of time and emotional energy, and also unnecessary.

Unfortunately, issues like the FTX debacle or the recent Bostrom situation might make up a significant part of a prospective EA's impression of EA, because other aspects may not have reached their news sources. Even a small-scale community builder might want some good answers in the face of troubling news they have encountered.

Fair. I think in FTX worlds, it should in fact be harder to get people who strongly dislike fraud on board with EA, and in Bostrom-email worlds, it should in fact be harder to get people who strongly dislike the apology on board with EA. And this difficulty, to the extent we care about people turned off by either event having a favourable opinion of EA, is actually right and just.

Thank you for your hard work, Megan. Being a social worker sounds hard in itself to me. I think it's extremely generous and impressive that in addition to that you volunteer, and donate, and are vegan. That does sound exhausting, even without the community dramas that feel kind of incessant at the moment. 

It feels really tiring to me too. EA itself feels tiring. It's tiring that there's always more to do and that I could always be prioritising harder and helping others more. A thing that makes that bearable to me is working together - having a community supporting each other in trying to do good but also giving each other a break. It feels horrid to instead have people angry at each other, and getting fire from outside the community too. I hope your local community is supportive. I hope you guys are able to help each other to focus on the good and to take breaks, and deeply care about each other, while you're all improving the lives of others. I hope some of the appreciative, caring messages people have left in response to this post have helped the burden feel a little lighter.

How much drama/scandal has there actually been in EA beyond FTX (admittedly a huge scandal) and now Bostrom (in my opinion something that doesn't need to be given all that much oxygen)?

Serious question.

There were also these: cults.

Thank you for writing this, Megan. I don't even consider myself liberal or left, but it is quite exhausting, as a community builder, to explain to friends, colleagues at work, and fellowship members that EA is a method of thinking and a goal of doing the most good - not shady billionaire schemes, self-promotion, and justifying a lack of sensitivity with rationality. So there is not much epistemic value to my comment, but I just wanted to express that I am tired as well.

I strongly felt the same when I opened the forum today. When I saw the post titles mentioning "some apology email of Bostrom," I figured it couldn't have been anything good, and my first thought was "Oh god, not again."

I'd love to think about and discuss how to make the world the best place it can be. I'd also love to spread EA ideas to other people, but the bar to then also join the community is high, and the ideas, once you're beyond the principles, are weird and difficult for many to grasp. When I first learned about EA, I thought of it as a very elite thing. I'd love to be able to say that this is no longer the case, but sadly, I can't.

I've been thinking about starting an EA university group, but the feelings expressed above have made me doubt doing so. The "drama" (apologies for not finding a better word) of today and the previous months makes me doubt even more. I don't want to represent a community that is about apologizing for other people's actions, nor do I want to invite other people into such a community.

I'm tired, too. I'm tired of constantly having to watch what I say or don't say because of the fear that some mob will descend on me and try to ruin my life as they are now trying to do to Nick Bostrom. Perhaps someone in your situation can find it in themselves to have sympathy for people in my position and not just people in yours.

It seems a number of people agree with your sentiment, throwaway791, so I'm choosing to respond:

Most people do not experience a "mob descending on them and trying to ruin their life," especially for merely something they said, and especially for something they couldn't have anticipated—perhaps by first getting input from a few trusted friends or colleagues—would cause life-altering backlash.

If this is a sensation you genuinely routinely find yourself experiencing such that it is causing you exhaustion, it sounds horribly stressful and I'm very sorry you are going through this. I would encourage you to consider speaking to a mental health professional because my layperson perspective is that this is an exaggerated fear that may be indicative of larger issues around anxiety.

That aside, words and the concepts they perpetuate have consequences. Those consequences can range from hurt feelings to genocidal ideologies taking root. I consistently express views that are outside the social norm, but I don't share your fear. I think this is because (1) I am never "an edgelord", (2) I don't often express ideas that are likely to harm others, and (3) I spend enough time learning about the world and others' views and experiences to trust I am calibrated well to what is incendiary and why. If you're spending a great deal of time watching what you say and experiencing uncertainty regarding how others will respond, maybe you're experiencing a calibration issue. If so, this seems very addressable by exposing yourself to more areas of discourse, cultivating and practicing empathy, and doing more listening than speaking.

I consistently express views that are outside the social norm, but I don't share your fear.

What you're missing here is the concept of heresy.

Plenty of views that are outside the social norm are not considered heretical. For instance, if you don't like a movie that most people like or vice versa, that view is going to be outside the social norm but not in a way that means people who express it are punished for their opinions.

The problem is that some true things are considered heretical, and therefore you must be cautious in how much you reveal of what you really believe, even in private as such private communications can end up being publicized in a way that's out of your control.

That aside, words and the concepts they perpetuate have consequences. Those consequences can range from hurt feelings to genocidal ideologies taking root.

Yes, I'm sure Catholics in 16th century Spain also believed that "words and the concepts they perpetuate have consequences", for example such as God deciding to punish your civilization by causing crop harvests to fail three years in a row because you were insufficiently pious and didn't suppress opponents of Catholicism sufficiently strongly.

You can always tell stories about how people expressing views you don't like will have bad consequences. Most of these stories are utter hogwash and are nothing more than exercises in self-delusion. Please keep the past record of such storytelling in mind when you make claims of alleged harm supported only by poor evidence and implicit intuitions you're unable to share with others.

If you're spending a great deal of time watching what you say and experiencing uncertainty regarding how others will respond, maybe you're experiencing a calibration issue.

I know exactly how others will respond. My uncertainty is about what I have said where and when will suddenly become the subject of attention by people who wish to attack me. There's no calibration issue if you're afraid to declare you're an atheist in 16th century Spain, just a well-calibrated recognition that doing this will be very bad for you.

It is difficult to engage with you because the potential range of beliefs you could be referring to is so vast, there is no context aside from the Bostrom email, and you're an anonymous account, so there is even less context. Regardless, it sounds like this still comes down to the question of why making (allegedly) truthful statements would lead to life-altering backlash, and I see a distinction between "because it threatens the norm and people see it as heretical" and "because it is tied to ideologies that have caused vast suffering." I also would want to know the purpose of making the incendiary statements: Are they helpful in actually improving the world? Or are they promoting truth for truth's sake?

Fwiw, many of the views I express would accurately be described as heretical, by your definition. And I have personally experienced a mob descending and attempting to ruin my life as a result.

Valid.

Thanks for expressing your position.

Sorry you're in this situation! :(

 

I want to allow my problem solver side to say a few things, but, eh, if that's not what you want, totally skip my comment

 

So:

[reminder this is ignorable] 

Maybe community building isn't a good fit for you now, in a time where it involves all these things you don't like? 

  1. Or maybe you could do community building in some niche (like.. run the EA Product Managers group or something, if they don't have someone doing that) where you're not exposed to the parts you don't like?
  2. My cold take is that too many EAs are in roles they find tiring/stressful/something, and that these EAs can find another role that is (in my interpretation of 80k's words) a better fit for them.
    1. I personally see software developers who want to be data scientists, and data scientists who want to be software developers, and each think they've got to do the not-fun-thing. Maybe you're somehow in this situation? I know at least one person who is fascinated by preventing and managing PR problems. I personally don't get it, but these people do exist. I know another person mission driven (in a good way) about reducing the.. toxicity. Maybe there's something you're excited about that you don't understand why not everybody else wants to do?
  3. A reminder that I know nothing about you, this is a long shot guess + pattern matching. Ignoring my comment is totally a legit move. (or maybe someone else who resonates with your post would resonate with this comment, I hope)

<3

[This comment is no longer endorsed by its author]

I think you're actually right here. A couple of points stand out to me: 

  1. Megan says she still wants to build a movement that 'wants the world to be a better, kinder, softer place'
  2. EA, like it or not, probably isn't that movement - EA is a movement that mashes together lots of different counterintuitive, often edgy ideas, makes trade-offs that clash with more mainstream ethical views, relies on often obscure philosophies and the work of a few very clever, fairly weird people... and is funded to a pretty major degree by those billionaires she's so tired of.

I suspect that she really wants the EA movement to be something it's not, and finds the cognitive dissonance of trying to build this movement particularly stressful. As you mention, some people just enjoy PR and some people might just be happy defending a movement that they have major disagreements with, because the alternative is worse (political parties seem the obvious example) but Megan doesn't seem to be in either category. 

Personally, I feel more comfortable with the way EA is, and I acknowledge the trade-offs - when I do community building (less frequently), I don't feel tired by any of the issues mentioned. I tend to take the fairly consistent line that, if a certain scandal or drama within EA makes the movement something that you really don't want to identify with, you're probably not right for EA (but I'm happy talking about recommended charities etc.). I don't feel like I've ever been 'in a position where I have to apologize for sexism, racism, and other toxic ideologies within this movement'.

I think you've entirely misidentified the point. OP is not tired of community building, but of the way that EA elevates certain people and the problems left in its wake - "cult of personality," as it's commonly known. EA might have fewer problems with optics if it weren't for people elevating figures like Bostrom and SBF to these ridiculous heights. Community building is probably fine; damage control is what sucks.

Phrases like "EA elevates people" are becoming common, but it is very unclear what this means. Nick Bostrom created groundbreaking philosophical ideas. Will MacAskill has written extremely popular books and built communities and movements. Sam Bankman-Fried became the richest man under 30 in a matter of months. All of these people have influenced and inspired many EAs because of their actions.

Under any reasonable sense of the word, people are elevating themselves. I think EA is incredibly free from 'cult of personality' problems - in fact it's amazing how quickly people will turn against popular EAs. But in any group, some people are going to get status for doing their work well. 

Does Bostrom actually have a cult of personality/is elevated to ridiculous heights?

He doesn't have a Twitter account (or any other social media presence as far as I'm aware), doesn't participate on EA (or EA adjacent) forums, doesn't blog frequently and doesn't do media tours to promote himself.

 

Is this necessarily an EA optics problem?

The Times article on the controversy mentions "Oxford don" in the headline, and there was no mention of "effective altruism" in the body of the article.

I expect the mainstream zeitgeist on this article to be more about Bostrom's Oxford connection than his effective altruism connection.

 
I'm unconvinced that:

  1. EA has a Bostrom specific optics problem
  2. Bostrom has a cult of personality within EA

Thanks for explaining, retracted

Retracted even the heart

Surely the heart is still endorsed by the author!

It became a broken heart after retraction! 💔

Surely, damage control and crisis management is part of community building work? 

My understanding is that it's not that she never wants to do damage control and crisis management - but that she is tired of constantly having to do it, and of the fact that it crowds out the other aspects of EA and community building.

I empathise with this a lot, and know many others who do too.

Thanks for sharing where you're at and also for all the effort you've put in and for not quitting <3

girl*, same

*or however you identify

I share this feeling. I feel like EA has trended in the direction of some other groups I've dealt with where the personalities and interpersonal issues of a small number of people at the top come to be overly dominant. 

I've also had my faith in the movement fractured a bit by seeing how much of how things were run seems to be based on friends of friends networks. I had naively assumed they were doing the kind of due diligence and institutional division of power that other charitable organisations do.

A lot of this isn't a particular specific set of issues, but rather a general sense of one's estimates of people being shifted downward.

Thanks for your thoughts here.

Thank you for writing this.

I don't do community building work, but I too am tired of these things being so present, and even accepted. Other than some specific projects that seem good to me (e.g. GiveWell, Charity Entrepreneurship), I'm having a harder and harder time sticking to the view that the movement as it exists is expected to have a big positive impact.

I strongly disagree with your implication that "these things" (presumably "sexism, racism, and other toxic ideologies" as mentioned in the original post) are "accepted" within this movement, and I'm tired of stuff like this being brought up and distracting us from the mission we're all here for, which is to help others.

This. I do not think Nick Bostrom made a good apology at all for his views, but I do like CEA's response here to link it:

https://forum.effectivealtruism.org/posts/ALzE9JixLLEexTKSq/cea-statement-on-nick-bostrom-s-email

Rich and powerful people have hijacked this movement.

I want EA to be a movement about ambitiously working towards a brighter future for humanity.

To that end, it's a feature not a bug that some EAs are rich/powerful and that EA attracts some of those kinds of people.
