Suppose I have a cause I’m passionate about. For example, we’ll use fluoridated water. It’s poison. It lowers IQs. Changing this one thing is easy (just stop purposefully doing it), has negative cost (it costs money to fluoridate water; stopping saves money), and has huge benefits. That gives it a better cost-to-benefit ratio than any of EA’s current causes. I come to EA and suggest that fluoridated water should be the highest priority. Is there any organized process by which EA can evaluate these claims, compare them to other causes, and reach a rational conclusion about resource allocation to this cause? I fear there isn’t.

Do I just try to write some posts rallying people to the cause? And then maybe I’m right but bad at rallying people. Or maybe I’m wrong but good at rallying people. Or maybe I’m right and pretty good at rallying people, but someone else with a somewhat worse cause is somewhat better at rallying. I’m concerned that my ability to rally people to my cause is largely independent of the truth of my cause. Marketing isn’t truth-seeking. Energy to keep writing more about the issue, when I’ve already made points (which are compelling if true, and which no one has refuted), is different from truth-seeking.

Is there any reasonable onboarding process, with specific, actionable steps, to guide me in getting my cause taken seriously? I don’t think so.

Is there any list of all evaluated causes, their importance, and the reasons? With ways to update the list based on new arguments or information, and ways to add new causes to the list? I don’t think so. How can I even know how important my cause is compared to others? There’s no reasonable, guided process that EA offers to let me figure that out.

Comparing causes often depends on some controversial ideas, so a good list would take that into account and give alternative cause evaluations based on different premises, or at least clearly specify the controversial premises it uses. Ways those premises can be productively debated are also important.
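To make this concrete, here’s a minimal sketch of what a cause list with alternative evaluations under different premises could look like. Everything here is invented for illustration (the causes, the scores, the premises); it’s only meant to show the structure: the same cause data yields different rankings depending on which controversial premise you adopt, and the premise in use is stated explicitly rather than hidden.

```python
# Hypothetical sketch: a cause list whose rankings are recomputed under
# different sets of controversial premises. All names and numbers are
# made up; nothing here is real EA data.

PREMISES = {
    # One controversial premise: how much weight do future people get?
    "longtermist": {"future_people_weight": 1.0},
    "neartermist": {"future_people_weight": 0.0},
}

CAUSES = [
    # (name, near-term benefit score, far-future benefit score, cost)
    ("cause_a", 10.0, 90.0, 5.0),
    ("cause_b", 40.0, 5.0, 5.0),
]

def rank(premise_name):
    """Return causes sorted by benefit/cost under the named premise."""
    w = PREMISES[premise_name]["future_people_weight"]
    scored = [
        (name, (near + w * far) / cost)
        for name, near, far, cost in CAUSES
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# The same list gives different orderings under different premises:
# rank("longtermist") puts cause_a first; rank("neartermist") puts cause_b first.
```

The point of the structure is that adding a new cause, updating a score in response to a new argument, or debating a premise are all explicit, inspectable operations rather than things that happen implicitly via social dynamics.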

Note: I’m primarily interested in processes which are available to anyone (you don’t have to be famous or popular first, or have certain credentials given to you by a high-status authority) and which can be done in one’s free time without having to get an EA-related job. (Let’s suppose I have 20 hours a week available to volunteer for working on this stuff, but I don’t want to change careers. I think that should be good enough.) Being popular, having credentials, or working at a specific job are all separate issues from being correct.

Also, based on a forum search, stopping water fluoridation has never been proposed as an EA cause, so hopefully it’s a fairly neutral example. But this appears to indicate a failure to do a broad, organized survey of possible causes before spending millions of dollars on some current causes, which seems bad. (It could also be related to the lack of any good way to search EA-related information that isn’t on the forum.)

Do others think these meta issues about EA’s organization (or lack thereof) are important? If not, why? Isn’t it risky and inefficient to lack well-designed processes for doing commonly-needed, important tasks? If you just have a bunch of people doing things their own way, and a bunch of others reaching their own evaluations of whatever subset of information they happened to look at, outcomes will end up being determined by social hierarchy.



As I see it, EA is not a single consensus; different individuals reach very different conclusions about resource allocation, as you can see e.g. in the current "Where are you donating this year, and why?" thread, or by comparing Founders Pledge’s Global Health and Development Fund grants with GiveWell’s All Grants Fund.
Also, it seems to me that there are many ideas that people are passionate about, but are often bottlenecked by a lack of implementers (i.e. people willing and able to turn those ideas into concrete projects).
When I see a successful new EA project, it never seems to happen because "EA" reached the conclusion that the project was important and allocated resources to it, but because some individuals developed a theory of change and worked to make it happen.

You might want to check out https://forum.effectivealtruism.org/s/AbrRsXM2PrCrPShuZ

I pretty much agree that it doesn’t seem optimal to have people trying to drum up hype with a blog post when they think there is an opportunity for high impact. It would be nice to have a site with thousands of very modular forecasts/impact estimates that you can paste together, so that people can see the numbers clearly and quickly.

 

I think this is sorta trying to do that on a less ambitious level.
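The "paste together modular forecasts" idea can be as simple as multiplying independent point estimates, Fermi-style: each factor is a separate, individually-debatable forecast, and the overall impact estimate is just their product. A toy sketch (all numbers and factor names invented):

```python
# Hypothetical sketch of composing modular estimates. Each factor is an
# independent forecast; the combined impact estimate is their product.
# The factors and values below are made up for illustration.

factors = {
    "people_affected": 1e6,       # forecast A
    "effect_per_person": 0.01,    # forecast B (e.g. QALYs per person)
    "probability_it_works": 0.3,  # forecast C
}

def combine(factors):
    """Multiply all factor estimates into one overall impact number."""
    result = 1.0
    for value in factors.values():
        result *= value
    return result

impact = combine(factors)  # roughly 3000
```

Because each factor is its own modular forecast, someone who disputes only "probability_it_works" can swap in their own number and immediately see how the bottom line changes, without re-arguing the whole case.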