
Just some ranty thoughts about EA university groups, without any suggestions. Don't take it too seriously.

Lots of university EA group organisers I have met don't seem to be very knowledgeable about EA. A common type is someone who got involved for social reasons and uses EA terms in conversation but doesn't really get it. I can imagine this being off-putting to the kinds of people these groups would like to attract. This is probably less of a problem at top universities, though.

It also feels awkward to mention this to people, because I know these group organisers have good intentions, but they may be putting off cool people from engaging with EA groups. It is even more awkward when community building is their part-time job. It's not that they're bad; it's just that I wouldn't be excited about a promising student first encountering EA by interacting with them.

Less confidently, it seems like some groups place too much emphasis on being very agenty right away and making big projects happen (especially community-building projects), compared to having a culture of intellectual curiosity and prioritising interesting conversations that are not about community building. It also feels like young people in EA face strong incentives to network hard, go to Berkeley, and go to a bunch of retreats so they have cool, important EA friends, and all of this cuts into the time they could spend just sitting down to learn important things, skill up, and introspect.

There have been posts on the EA Forum pointing at similar things, so the situation might improve over the next year, but this is just me recording what my experience has been like at times.

I hope more people, especially EA community builders, take some time to re-evaluate the value of growing the EA movement and of EA community building. A lot of community builders seem to be acting as if "making more EAs" is good for its own sake. I'm much less sure about the value of growing the EA community, and uncertain about whether it is positive at all. A lot of people are having to put energy into doing PR, making EA look good, and fighting fires in the community, when their time could be better spent directly working out how to solve the big problems.

But I also think directly focusing on how to solve the big problems is difficult and "get more people into EA and maybe some of them will know how to make progress" feels like an easy way out. 

My intuition is that having more people does mean more potential fires could be started (since each person could start a fire), but it also means each fire is less damaging in expectation as it's diluted over more people (so to speak). For instance, the environmentalist movement has at times engaged in ecoterrorism, which is (I think pretty clearly) much worse than anything anyone in EA has ever done, but the environmentalist movement as a whole has generally weathered those instances pretty well as most people (reasonably imho) recognize that ecoterrorists are a fringe within environmentalism. I think one major reason for this is that the environmentalist movement is quite large, and this acts as a bulwark against the entire movement being tarred by the actions of a few.

I guess I make comments like the one above because I think few people doing EA community building are seriously considering that the actual impact (and expected impact) of the EA movement could be net negative. It might not be, and I lean towards it being positive, but I think it is a serious possibility that the EA movement causes more harm than good overall, for example via having sped up AI timelines through DeepMind/OpenAI/Anthropic, and via a few EA community members committing one of the biggest frauds ever. Or via vaguer failure modes: EAs fuck up cause prioritisation, maximise really hard, and can't course-correct later.

The way the EA movement ends up not being net harmful is if we are ambitious but prioritise being correct and having good epistemics really hard. This is not the vibe I get when I talk to many community builders. A lot of them seem happy with "making more EAs is good" and forget that the mechanism for EA being positively impactful relies pretty heavily on our ability to steer correctly. I think they've decided too quickly that "the EA movement is good, therefore I must protect and grow it". I think EA ideas are really good; I'm less sure about the movement.
 

If EA is net harmful then people shouldn't work directly on solving problems either, we should just pack up and go home.

I like EA ideas. I think sanely trying to solve the biggest problems is a good thing. I am less sure about the current EA movement, partly because of its track record so far and partly because of the intuition that movements this focused on gaining influence and recruiting more people tend to go off track, and it doesn't look to me like enough is being done to preserve people's sanity and help them think clearly in the face of the mind-warping effects of the movement.

I think it could both be true that we need a healthy EA (or longtermist) movement to make it through this century and that the current EA movement ends up causing more harm than good. To be clear, I currently think that on its current trajectory the EA movement will end up being net good, but I am not super confident in this.

Also, sorry, my answer mostly comes from thinking about AI x-risk stuff rather than EA as a whole.

EA didn't cause the FTX fraud.

Huh, not sure what you mean. It sure seems like the FTX fraud was committed by prominent EAs, in the name of EA principles, using the resources of the EA movement. Inasmuch as EA has caused anything, I feel like it has caused the FTX fraud.

Like, by the same logic you could be like "EA didn't cause millions of dollars to be allocated to malaria nets". And like, yeah, there is something fair about that, in the sense that it was ultimately individual people or philanthropists who gave money to EA causes, but at the end of the day, if you get to take some credit for Dustin's giving, you also have to take some blame for Sam's fraud.

For instance, the environmentalist movement has at times engaged in ecoterrorism, which is (I think pretty clearly) much worse than anything anyone in EA has ever done

Alas, I do think this defense no longer works, given FTX, which seems substantially worse than all the ecoterrorism I have heard about (and IMO also given the capabilities research that's downstream of our work, like RLHF being the primary difference between ChatGPT and GPT-3, but that's a longer argument, and I wouldn't want to bring it up as a commonly acknowledged point).

[anonymous]

Alas, I do think this defense no longer works, given FTX, which seems substantially worse than all the ecoterrorism I have heard about.

I disagree with this, because I believe FTX's harm was way less bad than most ecoterrorism, primarily because of the kind of disutility involved. FTX hasn't actually injured or killed people, unlike a lot of ecoterrorism. It stole billions, which isn't good, but no violence was involved. I don't think FTX is good, but so far no violence has been attributed to, or even much advocated by, EAs.

[This comment is no longer endorsed by its author]

Yeah, it doesn't seem like a totally crazy position to take, but I don't really buy it. I bet a lot of people would accept some probability of having violence inflicted on them in exchange for $8 billion, and I don't think this kind of categorical comparison of different kinds of harm checks out. It's hard to really imagine the scale of $8 billion, but I am confident that Sam's actions have killed, indirectly via a long chain of actions but in a way he is nevertheless directly responsible for, at least 20-30 people, which I think is probably more than any ecoterrorism that has been committed (though I am not that confident about the history of ecoterrorism, so maybe there was actually something that got to that order of magnitude?).

[anonymous]

IMO ecoterrorism's deaths primarily come from the Unabomber, who caused at least 3 deaths and 23 injuries. I may retract my first comment if I can't find more evidence than this.

[This comment is no longer endorsed by its author]

The Unabomber does feel kind of weird to blame on environmentalism. Or like, I would give environmentalism a lot less blame for the Unabomber than I would give us for FTX.

There are different ways to approach telling people about effective altruism (or caring about the future of humanity, or AI safety, etc.):

  • "We want to work on solving these important problems. If you care about similar things, let's work together!"
  • "We have figured out what the correct things to do are and now we are going to tell you what to do with your life"

It seems like a lot of EA university group organisers are doing the second thing, and to me this feels weird and bad. A lot of our disagreement about specific things is about them operating in that second frame: for example, I feel it is icky to use prepared speeches written by someone else to introduce people to EA, and bad to think of people who engage with your group in terms of where they are in some sort of pipeline.

I think the first framing is a lot healthier, both for communities and for individuals doing activities under the category of "community building". If you care deeply about something (e.g. using spreadsheets to decide where to donate, forming accurate beliefs, reducing the risk we all die due to AI, solving moral philosophy) and you tell people why you care and they're not interested, you can just move along and try to find people who are interested in working on those problems with you. You don't have to make them go through some sort of pipeline where you start with the most appealing concepts and build them up to the thing you actually want them to care about.

It is also healthier for your own thinking because putting yourself in the mindset of trying to persuade others, in my experience, is pretty harmful. When I have been in that mode in the past, it crushed my ability to notice when I was confused. 

I also have other intuitions for why the second approach just doesn't work if you want to attract highly capable individuals who will actually solve the biggest problems, but in this comment I just wanted to point out the distinction between the two ways of doing things. They are distinct mindsets that lead to very different actions.

My defense of posting pseudonymously:

It does feel like I'm defecting a little bit by using a pseudonymous account. I do feel like I'm somewhat intentionally trying to inject my views while getting away with not paying the reputational cost of having them.  

My comments use fewer caveats than they would if I were posting under my real name, and I'm more likely to blurt out things I currently think without spending lots of time checking how correct I am. I also feel under little obligation to signal thoughtfulness and niceness when not writing under my name. Plausibly this contributes to lowering the quality of the EA Forum, but I have found it helpful as a (possibly temporary?) measure to practise posting anything at all. I think I have experiences and opinions that I want others on the EA Forum to know about, but I don't want to do the complicated calculation to figure out whether it is worth posting them under my real name (where a significant part of that calculation is non-EA people coming across them while searching for me on the internet).

I also prefer a situation where people in the EA community can anonymously share their controversial views over one where they don't say anything at all, because it makes it easier to get an accurate pulse of the movement. I have mostly spent my time in social groups where saying things I think are true would have been bad for me, and I notice that this caused my thinking to become a bit stunted, as I avoided thinking thoughts that would be bad to share. Writing pseudonymously has been helpful for noticing that problem and practising thinking more freely.

Also, idk, pseudonyms are really fun to use; I like the semi-secret-identity aspect of them.

Yeah, pseudonyms are great. There have been recent debates about people using one-off burner accounts to make accusations, but those don't reflect at all on the merits of using durable pseudonyms for general conversation.

The degree of reputation and accountability that durable pseudonyms provide might be less than using a wallet name, but it's still substantial, and in practice it's a perfectly sufficient foundation for good discourse.

As someone who is pretty far on the anti-pseudonym side of the debate, I think your point about caveats and time saved is a real concern.

Idk, this just occurred to me though... what about norms of starting a comment with:

epistemic status: blurted

or

epistemic status: halp I just felt someone needed to say this thing, y'all pls help me decide if true

And couldn't that be fun too, maybe? If you let it be so?

Yeah, that does seem useful. 

I still think I've found being pseudonymous more useful than writing under my name. It feels like I'm less restricted in my thinking because I know there are no direct negative or positive effects on me personally for sharing my thoughts. Surprisingly, for example, I've found it easier to express genuine appreciation for things or people. Perhaps I'm too obsessed with noticing how the shape of my thoughts changes depending on how I think they will be perceived, but it has been very interesting to notice. It genuinely feels like there are more thoughts I am allowed to think when I'm trying on a pseudonym (this was much starker a few months ago, so maybe I've squeezed out most of the benefit by now).

Maybe the folks funding community building at universities should make this option more obvious: doing paid community-building work at the start of term for a couple of months, then focusing on personally skilling up and learning things for the rest of the year. For a lot of group organisers, this might be better than consistently spending 10 hours/week organising throughout the year.

Since EAG SF is coming up and not all of us will get to talk to everyone it would be valuable to talk to, it would be cool for people to document takeaways from specific conversations they had (with those people's consent), rather than just takeaways from the entire conference, which people have posted on the forum in the past.

This is an example of what I meant: https://www.lesswrong.com/posts/aan3jPEEwPhrcGZjj/nate-soares-life-advice

I wish there were more nudges to make posts like that after EAGs.
 

From https://forum.effectivealtruism.org/posts/9Y6Y6qoAigRC7A8eX/my-take-on-what-we-owe-the-future#Thoughts_on_the_balance_of_positive_and_negative_value_in_current_lives:

I feel like I basically have no idea, but if I had to guess I’d say ~40% of current human lives are net-negative, and the world as a whole is worse than nothing for humans alive today because extreme suffering is pretty bad compared to currently achievable positive states. This does not mean that I think this trend will continue into the future; I think the future has positive EV due to AI + future tech.

I share these intuitions, and this is a huge part of the reason reducing x-risk feels so emotionally compelling to me. It would be so sad for humanity to die out so young and unhappy, never having experienced the awesome possibilities that otherwise lie in our future.

Like, the difference between what life is like and the sorts of experiences we can have right now, versus just how good life could be and the sorts of pleasures we could potentially experience in the future, is so incredibly massive.

It also feels like the only way to make up for all the suffering people experienced in the past and are experiencing now, and the suffering we inflict on animals, is to fill the universe with good stuff: create so much value, so many beautiful experiences, and whatever else is good and positive and right, that things like disease, slavery, and the torture of animals seem like a distant and tiny blot in human history.