All of nadavb's Comments + Replies

Thank you for this comment!

You are absolutely right. I didn't really explore any consequences of EA being cultish other than optics. As I said in the post, I don't really have a good mental model of all the ways in which it plays out, but I do have a strong intuitive sense that it does have other bad consequences (honestly, this entire post is based on intuitions and anecdotal evidence - none of my claims are based on rigorous studies).

Having said that, here's a very partial list of other consequences that I believe exist:

1. Making…

Yes, the timing was funny (it looks like virtually everyone on the forum has something to say about EA culture nowadays :P).
I commented on your post.

Some quick impressions and thoughts:

1. I like historical anecdotes and forgotten/underappreciated pieces of history. I enjoyed learning about the Quakers and some of their achievements.

2. I agree that a lot of the discussion on whether religion is good or bad is incredibly superficial. Nowadays it's popular (among secular elites) to slam religion, but I'm quite certain that religions have played important roles in many positive developments (and, on the other hand, in many atrocities). Of course different religious groups are very diffe…

Grayden (1y):
I think it's important to consider the counterfactual when considering the impact of religious groups. For example, many religious terrorists may simply be using religion to try to claim moral authority when the reality often is that their behavior contradicts what the religion teaches. Without religion, they might still be terrorists. I also think that a lot of the positives are not reported / downplayed to fit a secular narrative, e.g. the anti-slavery movement relied on the church.
Lawrence Newport (1y):
Thanks for this - really appreciate your thoughts! On 3 - I think seriously examining what made Quaker membership so impactful, and so far ahead of the moral curve, is something we should consider as a community. Various cultural aspects of Quakerism may have contributed meaningfully to their productivity. For example, I do wonder whether EA meetups that emphasise silence, with occasional spoken words or passages read aloud by members who feel compelled, would have a bunch of unknown positive effects on the quality of debate and ideas. I am definitely uncertain about what this would mean in a multitude of ways, but emulation means you can improve a community by adopting a set of positives that you might not, through a priori reasoning, recognise as positives. Things that seem unnecessary might be very important, and we should be open to historical precedents, try some of them (at least the particularly low-cost ones), and see if we find unintended positive results.

Quantify the overall suffering from different conditions, and determine whether there's misallocation of resources in biomedical research.

I suspect there's a big gap between the distribution of resources allocated to the study of different diseases and what people actually suffer from the most. Among other factors that lead to non-optimal allocation, I'd guess that life-threatening diseases are overstudied whereas conditions that may really harm people's well-being, but are not deadly, are understudied. For example, I'd guess that chronic pain is understud…

EdoArad (2y):
Related: Cochrane's series of papers on waste in science and Global Priorities Project's investigation into the cost-effectiveness of medical research

Thank you for writing down these good counterarguments. 

About your first and second points, that it’s wasteful to have someone’s career dedicated to a less promising cause area, I generally agree, but with a few caveats (which, for the most part, just reiterate and rephrase points already made in my post):

  1. I agree there’s value in considering whole causes as more or less promising on average, but I think that this low-resolution view overlooks a lot of important nuance, and that a better comparison should consider specific opportunities tha…
Gina_Stuessy (3y):
Agree on all points :) And thank you, again, for bringing up this issue of acceptance.

I like your suggestions for questions one could ask a stranger at an EA event!

About "social EA" vs. "business EA", I think I'd make a slightly different distinction. If you ask for someone else's (or some org's) time or money, then of course you need to come up with good explanations for why the thing you are offering (whether it is your employment or some project) is worthwhile. It's not even a unique feature of EA. But, if you are just doing your own thing and not asking for anyone's time or money, and just want to enjoy the company of other EAs, then this is the case where I think the EA community should be more welcoming and be happy to just let you be.

I totally agree. In order for an impact-oriented individual to contribute significantly in an area, there has to be some degree of openness to good ideas in that area, and if it is likely that no one will listen to evidence and reason then I'd tend to advise EAs to stay away from it. I think there are such areas where EAs could contribute and be heard, and the more mainstream the EA mindset becomes, the more such places will exist. That's one of the reasons why we really should want EA to become more mainstream, and why we shouldn't hide ourselves from the rest of the world by operating in such a narrow set of domains.

Thank you for bringing this post to my attention, I really like it! We appear to make similar arguments, but frame them quite differently, so I think our two posts are very complementary.

I really like your framing of domain-specific vs. cause-neutral EA. I think you also do a better job than me in presenting the case for why helping people become more effective in what they already do might be more impactful than trying to convince them to change cause area.

Thank you, Aaron, for taking the time to write this detailed and thoughtful comment on my post!

I'll start by saying that I pretty much agree with everything you say, especially in your final remarks - that we should be really receptive to what people actually want and advise them accordingly, and maybe try to gently nudge them into taking a more open-minded, general-impact-oriented approach (but not try to force it on them if they don't want it).

I also totally agree that most EA orgs are doing a fantastic job at exploring diverse causes and ways to imp…

Aaron Gertler (3y):
I like this example! It captures something I can more easily imagine happening (regularly) in the community.

One proposal for how to avoid this collective action problem would be for people to ask the same sorts of questions, no matter what area someone works on (assuming they don't know enough to have more detailed/specific questions). For example, instead of:

* Have you considered X?
* Do you think your thing, Y, is more impactful than X?

You'd have questions like:

* What led you to work on Y? (And then, if they say something about impact, "Were there any other paths you considered? How did you choose Y in the end?")
* What should someone not involved in Y know about it?
* What are your goals for this work? How is it going so far?
* What are your goals for this event? (If it's a major event and not e.g. a dinner party)

These should work about equally well for people in most fields, and I think that "discussing the value/promise of an area" conversations will typically go better than "discussing whether a new area 'beats' another area by various imperfect measures". We still have to take the second step at some point as a community, but I'd rather leave that to funders, job-seekers, and Forum commentators.

Depends on the context. Plenty of people in the EA space are doing their own thing (disconnected from standard paths) but still provide interesting commentary, ask good questions, etc. I have no idea what some Forum users do for work, but I don't feel the need to ask. If they're a good fit for the culture and the community seems better for their presence, I'm happy.

The difficulty comes when certain decisions have to be made — whose work to fund, which people are likely to get a lot of benefit from EA Global, etc. At that point, you need solid evidence or a strong argument that your work is likely to have a big impact. In casual settings, the former "vibe" seems better — but sometimes, I think that people who thrive in casual spaces get frustr…

I totally agree with you that many charities and causes can be a trap for young EAs and put their long-term careers in danger. In some cases I think this is also true of classic EA cause areas, if people end up doing work that doesn't really fit their skill set or doesn't develop their career capital. I think this is pretty well acknowledged and discussed in EA circles, so I'm not too worried about it (with the exception, maybe, that I think one of the possible traps is to lock someone into career capital that only fits EA-like work, thereby blocking them…

Thank you for sharing your thoughts!

About your second point, I totally agree with the spirit of what you say, specifically that:

1. Contrary to what might be implied by my post, EAs are clearly not the only ones who think that impact, measurement, and evidence are important, and these concepts are also gaining popularity outside of EA.

2. Even in an area where most current actors lack the motivation or skills to act in an impact-oriented way, there are more conditions that have to be met before I would deem it high-impact to work in this area. In particular…

Charles He (3y):
Thanks for the thoughtful reply. I think we probably agree that we should be cautious about prescribing that EAs go to charities or cause areas where the culture doesn't seem welcoming. Especially given the younger age of many EAs, and the lower income and career capital produced by some charities, this could be a very difficult experience or even a trap for some people.

I think I have updated based on your comment. It seems that having not just acceptance but also active discussion or awareness of "non-canonical" cause areas would be useful. I wonder, to what degree would your post's concerns be addressed if new cause areas were substantively explored by EAs to add to the "EA roster" (even if few cause areas were ultimately "added" as a result, e.g. because they aren't feasible)?

I agree that when you first present EA to someone, there is a clear limit on how much nuance you can squeeze in. For the sake of being concrete and down-to-earth, I see no harm in giving examples from classic EA cause areas (the example of distributing bed nets to prevent malaria as a very cost-effective intervention is a great way to get people to start appreciating EA's attitude).

The problem I see is more in the later stages of engagement with EA, when people already have a sense of what EA is but still get the impression (often unconsciously) that "if you really want to be part of EA, then you need to work on one of the very specific EA cause areas".

Thank you for your great feedback and suggestions! (and sorry for not responding sooner)

I guess that one’s meaning of a “major” or “moderate” limitation is, in the end, contingent on one’s aspirations. If we held ourselves to the standards of an organization like GiveWell, this would most certainly be a very big limitation. But quite early on we understood that we did not have the data to support conclusions about cost-effectiveness as strong as GiveWell’s recommendations. Rather, our approach was: let’s do the best we can with the data we have at hand, and…

Thank you!

I agree it would be nicer to report actual spared animals, rather than generic “portions of meat”. We thought of using data about the average meat diet in the relevant countries, to be able to translate portions of meat into animal lives. But we eventually decided against it, because it would introduce even more assumptions and uncertainties into our analysis, which we felt had many uncertainties already. Given the amount of uncertainty that we already have (with over an order of magnitude between our lower and upper bounds), we felt that giving…