nadavb

Thank you for this comment!

You are absolutely right. I didn't really explore any consequences of EA being cultish other than optics. As I said in the post, I don't really have a good mental model of all the ways in which it plays out, but I do have a strong intuitive sense that it does have other bad consequences (honestly, this entire post is based on intuitions and anecdotal evidence - none of my claims are based on rigorous studies).

Having said that, here's a very partial list of other consequences that I believe exist:

1. Making people with different levels of engagement with EA feel uncomfortable (you could say it's also just optics, but I think they have good reasons to feel uncomfortable).

2. Bad epistemics, groupthink and echo chamber effects (I develop this idea a bit further here).

3. Not engaging enough with people and opinions outside EA.

4. Attracting mostly very specific types of people (again, maybe this could be labeled as optics).

5. Radical beliefs.

And just to clarify (I know that you know that, but just for the record) - I'm not saying that outreach to children is necessarily a bad idea. It has many pros and cons that should be weighed somehow. I hope that my post has been helpful in describing some potential risks. 

Yes, the timing was funny (it looks like virtually everyone on the forum has something to say about EA culture nowadays :P).
I commented on your post.

Some quick impressions and thoughts:

1. I like historical anecdotes and forgotten/underappreciated pieces of history. I enjoyed learning about the Quakers and some of their achievements.

2. I agree that a lot of the discussion on whether religion is good or bad is incredibly superficial. Nowadays it's popular (among secular elites) to slam religion, but I'm quite certain that religions have played important roles in many positive developments (and, on the other hand, in many atrocities). Of course, different religious groups are very different from one another, and I think it's very likely that some have been net positive while others have been net negative (it likely also depends on what you'd consider the counterfactual alternative to religion, given that it's been so prevalent throughout most of human history).

3. It's not entirely clear to me what you suggest in this post. Do you think that EAs should embrace a more religious attitude in general (and what would that mean practically, given that religions are so different)? Or do you specifically advocate for Quakerism? (Again, what would that mean in practical terms?) Or should we just be more open to learning useful lessons from historical groups wherever they happen to present themselves? If you just intended for this post to provide some inspiration and didn't have clear action items in mind, that's also perfectly fine (I just left the reading with some uncertainty about what you were really trying to say).

Quantify the overall suffering from different conditions, and determine whether there's misallocation of resources in biomedical research.

I suspect there's a big gap between the distribution of resources allocated to the study of different diseases and what people actually suffer from the most. Among other factors that lead to non-optimal allocation, I'd guess that life-threatening diseases are overstudied, whereas conditions that may really harm people's well-being but are not deadly are understudied. For example, I'd guess that chronic pain is understudied compared to how much suffering it inflicts on society. It would be valuable to quantify the overall human suffering from different conditions and spot misallocations in biomedical research (and other societal efforts to treat these conditions).

For example, a random cohort of individuals could be asked to take a survey asking which conditions they would most want to get rid of, and how many life years they would be willing to sacrifice for it (either asking for the maximum number of years they would be willing to sacrifice for an operation that would be guaranteed to reduce their life expectancy by that many years and solve the condition they suffer from, or asking for the maximum probability of dying in that operation that they would be willing to take). Given the survey's results, it should be possible to quantify the overall suffering from different conditions, and then detect mismatches between these estimates and estimates of the resources (money and talent) allocated to addressing these problems.

It could also be interesting to address other potential reasons for mismatches between the social importance of conditions (in terms of overall well-being/suffering) and allocated resources, primarily the issue of tractability. For example, maybe condition A is causing more suffering than condition B, but it's easier to make progress on B, so we should prioritize it more. This could be figured out by interviewing experts and asking them to estimate how many resources it would take to make a given amount of progress (such as cutting the prevalence of a disease in half). I imagine that a similar style of study could be carried out in other settings where we'd want to find out whether society's allocation of resources really reflects what people care about the most.
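To make the comparison concrete, here is a minimal sketch (in Python, with entirely made-up numbers and condition names) of how time-trade-off answers from such a survey could be turned into rough per-condition burden scores and compared against research funding shares. This is only an illustration of the bookkeeping, not a validated method, and a serious analysis would also need the tractability estimates discussed above.

```python
# Toy sketch: convert hypothetical time-trade-off survey answers into rough
# "suffering burden" scores and compare them with research funding shares.
# All numbers and condition names below are invented for illustration.

# name: (avg_years_sacrificed, avg_remaining_life_expectancy,
#        prevalence, research_funding_share)
conditions = {
    "chronic pain": (8.0, 40.0, 0.20, 0.02),
    "condition X":  (3.0, 40.0, 0.05, 0.10),
}

def disutility_weight(years_sacrificed, remaining_years):
    """Fraction of remaining life a respondent would trade away to be cured
    (0 = no burden, 1 = as bad as death)."""
    return years_sacrificed / remaining_years

results = []
for name, (years, remaining, prevalence, funding) in conditions.items():
    weight = disutility_weight(years, remaining)
    burden = weight * prevalence  # rough population-level suffering score
    results.append((name, burden, funding))

# Normalize burden scores so they can be compared with funding shares directly.
total_burden = sum(b for _, b, _ in results)
for name, burden, funding in results:
    burden_frac = burden / total_burden
    gap = burden_frac - funding  # positive => possibly understudied
    print(f"{name}: burden share {burden_frac:.2f}, "
          f"funding share {funding:.2f}, gap {gap:+.2f}")
```

With the made-up inputs above, chronic pain would come out as carrying most of the burden while receiving a small share of funding, which is exactly the kind of mismatch the proposed study would try to detect (or rule out) with real data.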

Thank you for writing down these good counterarguments. 

About your first and second points - that it's wasteful to have someone's career dedicated to a less promising cause area - I generally agree, but with a few caveats (which, for the most part, just reiterate and rephrase points already made in my post):

  1. I agree there’s value in considering whole causes as more or less promising on average, but I think that this low-resolution view overlooks a lot of important nuance, and that a better comparison should consider specific opportunities that an individual has access to. I think it is entirely plausible that a better opportunity would actually present itself in a less-promising-on-average cause area.
  2. The EA community’s notion of what constitutes a promising or not-so-promising cause area could be wrong, and there is value in challenging the community’s common wisdom. I agree with your point that it’s better to assess the effectiveness of an opportunity in question without yet dedicating your entire career to it and that it’s a good idea to take a middle-ground approach between just thinking about it on the one extreme and immediately deciding to work on it for the next 40 years of your career on the other extreme. I think that trying non-conventional ideas for a short period of time (e.g. through a one-month side project or an internship program) and then reporting back to the community could be very valuable in many cases, and could also help people learn more about themselves (what they like to do and are good at).
  3. I would not urge people who are very cause neutral and EA-minded to work on a mainstream non-EA cause like curing cancer (but I would also not completely rule that out, mainly due to the "opportunity perspective" mentioned in point #1). But for people who are not that cause neutral, I would try to be more accepting of their choice than I feel the EA community currently is. As I wrote in my discussion with Aaron, I see this post as being more about "we should be more accepting of non-EA causes" than "we should encourage non-EA causes".

 

About your last comment, I really appreciate 80k's directness about the scope of their activity (and their being non-territorial and encouraging of other orgs that target populations 80k doesn't see as its main audience). As an entire community (one that transcends the scopes of specific orgs), I think we totally should be in the business of giving career advice to wider publics.

I like your suggestions for questions one could ask a stranger at an EA event!

About "social EA" vs. "business EA", I think I'd make a slightly different distinction. If you ask for someone else's (or some org's) time or money, then of course you need to come up with good explanations for why the thing you are offering (whether it is your employment or some project) is worthwhile. It's not even a unique feature of EA. But, if you are just doing your own thing and not asking for anyone's time or money, and just want to enjoy the company of other EAs, then this is the case where I think the EA community should be more welcoming and be happy to just let you be.

I totally agree. In order for an impact-oriented individual to contribute significantly in an area, there has to be some degree of openness to good ideas in that area, and if it's likely that no one will listen to evidence and reason, then I'd tend to advise EAs to stay away. I do think there are areas where EAs could contribute and be heard, and the more mainstream the EA mindset becomes, the more such places will exist. That's one of the reasons why we really should want EA to become more mainstream, and why we shouldn't hide ourselves from the rest of the world by operating in such a narrow set of domains.

Thank you for bringing this post to my attention, I really like it! We appear to make similar arguments, but frame them quite differently, so I think our two posts are very complementary.

I really like your framing of domain-specific vs. cause-neutral EA. I think you also do a better job than I do of presenting the case for why helping people become more effective in what they already do might be more impactful than trying to convince them to change cause area.

Thank you, Aaron, for taking the time to write this detailed and thoughtful comment on my post!

I'll start by saying that I pretty much agree with everything you say, especially your final remarks - that we should be really receptive to what people actually want and advise them accordingly, and maybe try to gently nudge them toward a more open-minded, general-impact-oriented approach (but not try to force it on them if they don't want it).

I also totally agree that most EA orgs are doing a fantastic job at exploring diverse causes and ways to improve the world, and that the EA movement is very open-minded to accepting new causes in the presence of good evidence.

To be clear, I don't criticize specific EA orgs. The thing I do criticize is pretty subtle, and refers more to the EA community itself - sometimes to individuals in the community, but mostly to our collective attitude and the atmospheres we create as groups.
 

When I say "I think we need to be more open to diverse causes", it seems that your main answer is "present me with good evidence that a new cause is promising and I'll support it", which is totally fair. I think this is the right attitude for an EA to have, but it doesn't exactly address what I'm alluding to. I'm not asking EAs to start contributing to new, unproven causes themselves, but rather to be open to others contributing to them.

I agree with you that most EAs would not confront a cancer researcher and blame her for doing something un-EA-like (and I presume many would even be kind and approach her with curiosity about the motives for her choice). But in the end, I think it is very likely she would nonetheless feel somewhat judged. Because even if every person she meets at EA Global tries to nudge her only very gently ("Oh, that's interesting! So why did you decide to work on cancer? Have you considered pandemic preparedness? Do you think cancer is more impactful?"), those repeated comments can accumulate into a strong feeling of unease. To be clear, I'm not blaming any of the imaginary people who met the imaginary cancer researcher at the imaginary EAG conference for having done anything wrong, because each one of them tried to be kind and welcoming. It's only their collective action that made her feel off.

I think the EA community should be more welcoming to people who want to operate in areas we don't consider particularly promising, even if they don't present convincing arguments for their decisions.  

I totally agree with you that many charities and causes can be a trap for young EAs and put their long-term careers in danger. In some cases I think it's also true of classic EA cause areas, if people end up doing work that doesn't really fit their skill set or doesn't develop their career capital. I think this is pretty well acknowledged and discussed in EA circles, so I'm not too worried about it (with the exception, maybe, that one of the possible traps is locking someone into career capital that only fits EA-like work, thereby blocking them from working outside of EA).

As to your question, if new cause areas were substantively explored by EAs, that would mitigate some of my concerns, but not all of them. In particular, besides having community members theoretically exploring diverse causes and writing posts on the forum summarizing their thinking process (which is beneficial), I'd also like to see some EAs actively trying to work in more diverse areas (what I called the bottom-up approach), and I'd like the greater EA community to be supportive of that. 
