
Say someone has already spent 10-20 hours acquiring basic EA knowledge. Maybe they read Doing Good Better and the 80,000 Hours career guide, and listened to a few 80,000 Hours podcasts. Is learning more about EA (e.g., reading this forum) helpful to them?

Here are some guesses for various roles I brainstormed:

  • Safety engineer at OpenAI/DeepMind. You should probably know a few high-level things about AI Safety (maybe some about longtermism as a whole), but beyond that, ML, software engineering, and general productivity skills seem to matter much more.
  • Earning to Give. Unless someone really likes, and is really good at, thinking through and applying these concepts, there doesn't seem to be much value in learning more about EA, since donating to the EA Funds of your chosen cause area (or maybe the donor lottery) is probably higher expected value than trying to pick out donation opportunities yourself (see the brief expected-value sketch after this list).
  • EA Career coach. Having broad knowledge of EA seems valuable.
  • Animal rights activist. You should probably have some broad knowledge of expected value and of which interventions the effective animal activism literature finds to work, but presumably most of your learning time is better spent networking with and learning from other activists.
  • Development econ researcher. Maybe EA can help you prioritize research questions, but mostly I just don't see the added value of learning about EA relative to normal dev econ tools?
  • Cause prioritization researcher. Having broad and deep knowledge of EA seems very valuable.
  • Community builder. Probably a good idea to have broad knowledge both of community-building models and of individual cause areas (so you can help advise members).
  • American Politics/Policy practitioner. EA doesn't seem to add much at the moment beyond the normal skills in the policy toolkit. It might be helpful to network with EAs, however, so that things on the grapevine can eventually reach you.
  • AI Policy researcher. EA tools seem valuable (low confidence).
  • Grantmaker in EA-heavy field. Having broad and deep knowledge of EA seems very valuable.
  • Journalist. Having a broad knowledge of EA seems valuable.
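
To make the earning-to-give point concrete, here's a minimal expected-value sketch of the donor lottery (the mechanism is the standard one, but the notation is mine). Suppose you contribute an amount c to a pool of total size P, and the winner, drawn with probability proportional to contribution, directs the whole pool. Then:

$$\mathbb{E}[\text{money you direct}] = \frac{c}{P} \cdot P = c$$

Your expected money moved is the same as if you had donated directly, but only the winner needs to invest serious research time in choosing where it goes. That's why deferring to a lottery (or to fund managers) can beat everyone doing their own shallow research.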

Naively, it looks like most roles that individual EAs could be in do not, at this moment, benefit from substantial EA knowledge. So for most of us, the main benefit of learning more about EA is something more nebulous, like entertainment, "unknown unknowns", or "feeling more connected to the community." Am I missing something major?


7 Answers

Yeah, I think you're missing the flow-through effects of contributing to the EA hivemind. There's substantial value in having a large number of reasonably well-informed EAs thinking about and discussing EA ideas. It's possible this is insignificant compared to the contributions of a handful of prominent full-time thinkers, but that seems like an important and nontrivial question in its own right.

+1 to this. I'd say I've spent anywhere from 70 to 150 hours consuming EA content, whether through events, books, podcasts, or videos. That's mainly because I'm a community builder and I'm earning to give, so I wanted to find out a lot about the different causes and concepts within EA.

However, I'd still recommend that even non-community-builders learn more about EA.

Returns probably start diminishing past 50 hours of learning, but if you want to help convince other people to become EAs, or to better understand and explain what EA is, you'll need about 30-50 hours of immersion.

I think that continually reading the EA Forum and understanding EA concepts in more depth helps me convey EA-related perspectives and concepts to my network.

For example, I can point people to resources on climate change, AI Safety, global health, and cause prioritization because I've read widely about EA and its causes. And I think more people should be aiming to do that.

Having a wide understanding of EA allows you to spread the knowledge more easily, and I believe there's a lot of value to spreading EA knowledge.

Agreed. Some positive things that can come from this:

  • cross-pollination of ideas, especially across cause areas and academic disciplines
  • relatedly, preventing ideas from becoming too niche/insular, by keeping them accessible from different viewpoints within the community
  • encouraging people to explore ideas they find off-putting in more depth, which might ultimately change their actions in the long term. For example, many people come to longtermism after many years in the community.
  • reinforcing norms of critically engaging with ideas, learning, keeping an open mind, and so on

To answer my own question, I suspect a lot of this comes from EA still being young and relatively pre-paradigmatic. A lot of valuable careers and projects seem to operate on a timescale of years or even months rather than decades, so keeping very up to date with what the hivemind is thinking, and interfacing that with your existing plans/career capital/network, lets you spot new opportunities for valuable projects that you otherwise may not have even considered.

I suspect that as EA matures and formalizes, following current EA thinking will become less and less fruitful for the typical EA, and engagement (after a possibly significant initial investment) will look more like "read a newsletter once in a while" and "have a few deep conversations a year", even for very serious and dedicated EAs with some free time.

I think the important questions are "what type of information?" and "what would they be doing otherwise?"

If you're interested in development economics and the choice is between reading everything posted on the Forum and taking a MicroMasters course in data and development*, I'd suggest the latter. But if it's a choice between scrolling Facebook and learning about EA, learning about EA is probably more useful.

*https://micromasters.mit.edu/dedp/

Then again, it might be better to use leisure time for genuine leisure (reading your favourite novel, playing with a kitten) and use your learning time for the most useful thing you can learn.

I do worry that people "learn about EA" because it's "learning" that lets them feel good about themselves and earns praise from others, but is easier than learning something they actually need in order to make the world a better place. If I made an objective list of what I need to learn to help people, EA Global videos and Forum posts wouldn't be in the top 10, but here I am...

Hmm, different people vary a lot in what they find effortful, but I'm guessing a reasonable substitute for Facebook and the EA Forum, for someone interested in development economics, isn't doing an online degree, but probably something like following (other?) development econ academics or practitioners on Twitter.

Knowing more about EA would let you make decisions that are better informed or better aligned with your values. How do you decide where to donate and what to do with your career? Not just which charities to support within a cause (you could trust the grantmakers of the various funds for that, though you might disagree with them too, if you knew more), but which causes?

You might disagree with the values or arguments that lead to certain recommendations, and find others more compelling, but you need to be aware of them first.

Quick take:

I think that in theory, if things were being done quite well and we had a lot of resources, most EAs really wouldn't need much outside of maybe 20-200 hours of EA-specific information, after which focusing more on productivity and career-specific skills would result in greater gains.

Right now things are messier. There's no single great textbook, and the theory is very much still in development. As such, it probably does require spending more time, though I'm not sure how much more.

I don't know if you consider these "EA" concepts, but I do have a soft spot for many things that have somewhat come out of this community but aren't specific to EA. These are more like things I really wish everyone knew, and they could take some time to learn. Some ideas here include:

  • "Good" epistemics (This is vague, but the area is complicated)
  • Bayesian reasoning (a small worked example follows this list)
  • Emotional maturity
  • Applied Stoicism (very similar to managing one's own emotions well)
  • Cost-benefit analyses and related thinking
  • Pragmatic online etiquette
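
As a small worked example of the Bayesian reasoning item above (the numbers are made up for illustration): suppose a surprising claim has a prior probability of 1%, and you run into evidence that's 10 times more likely if the claim is true than if it's false, say P(E|H) = 0.5 and P(E|¬H) = 0.05. Bayes' rule gives

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)} = \frac{0.5 \times 0.01}{0.5 \times 0.01 + 0.05 \times 0.99} \approx 0.09$$

Even fairly strong evidence only moves the claim from 1% to about 9%; the base rate still dominates. That's the kind of quick check I'd want everyone to be comfortable doing.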

If we were in a culture firmly attached to beliefs around the human-sacrificing god Zordotron, I would think that education to carefully remove both the belief and many of the practices it causes would be quite useful, but also quite difficult. Doing so may be decently orthogonal to learning about EA, but would generally seem like a good thing.

I believe the common culture taught in schools and the media is probably not quite as bizarre, but it's definitely substantially incorrect in ways that are incredibly difficult to rectify.

Thanks, Linch, for the post!


One comment: there are things one probably doesn't encounter in the first 10-20 hours that can be hugely useful (at least they were for me) in thinking about EA (both in general and domain-specific), e.g. this. (Perhaps that means such things should work their way into key intro materials...)


In general, I wish there were a better compilation of EA materials from intro to advanced levels. For intro materials, perhaps this is good. Beyond that, there is good content from:

  • 80,000 Hours career guides, problem profiles, and blog posts (some domain-specific, e.g. the AI safety syllabus -- not sure if such things exist for other cause areas)
  • Selected blog posts from EA orgs like GiveWell and Open Phil (there are many, but some are more meta and of general interest to EA, e.g. the GiveWell blog post I mentioned above)
  • Selected blog posts from individual EAs or EA-adjacent people
  • Selected EA Forum and Facebook group posts (there are too many, but perhaps the ones that won the EA Forum Prize are a good starting point)
  • EA Newsletter
  • David Nash's monthly summaries of EA-related content (here is one)

It would be great if such a compilation existed (for general EA as well as for specific topics/cause areas). It should probably be a living document that is kept updated, and it should ideally be prioritized, going down some order of importance so that people with limited time could work their way through it. Of course, selection is inherently subjective.

Perhaps the best way is to refer people to the EA Forum, the newsletter, various blogs, etc. But it seems nice to have a list of good articles from the past that someone could work through, e.g. during their commute.

(I'm really not sure about the marginal value of this; I just thought of it because I keep seeing older posts, which are quite interesting, referred to in EA Forum posts. Perhaps if a post were interesting enough I would come across someone citing it eventually, but there are definitely things I found pretty interesting that I could easily have missed. I'm not confident about the value, but it's perhaps worth thinking about as part of our movement-building work. Even partial work on this could be valuable -- doing the first "20%" that has "80%" of the value, metaphorically.)

My initial thoughts are similar to those of Adam and Linch, but I'll post them here anyway, despite possible redundancy:

  • If someone learns about an EA concept in detail, they might be able to use that to generate unique insights or follow-up work, even if many other people already know a lot about that concept. Reasons this might happen:
    • They are more up-to-date on the concept than others who learned about it a while ago (e.g. they've read very recent studies related to the concept)
    • They bring unique "outside knowledge" to the concept (e.g. they have a background in sociology, which none of the other people who knew about the concept had)
    • They have the time/skill to write out what they know in a way that a much wider audience can understand without much effort, which wasn't the case for the others
    • They have access to funding/other resources that the others didn't have (e.g. they can pay for a massive survey to gather direct data that furthers their knowledge, or they're in touch with experts in the relevant field who can help to deepen their knowledge)
  • Learning about concepts in detail seems important if someone might pursue a career in one of several fields. I'd want them to make such a decision based on a lot of firsthand information about each field, in addition to whatever shared resources the EA community can contribute (as personal characteristics and background play a big role in how well someone's career will go in a given field)
  • Someone who learns about a concept in detail might wind up explaining it (in less detail) to people they know personally who might otherwise never learn about it. Even if the community provides many "authoritative" sources of written knowledge, it's still hard to replace the benefit of having someone you know explain something to you and answer your questions in real time.
    • Similarly, they might be able to persuade other people with their combination of knowledge + personal connection. Many members of this community are only here because someone they knew persuaded them to get involved, and I'm guessing that "go read this article" is less persuasive than "let me tell you about this exciting thing that I'm clearly well-informed about"
  • Learning about things in detail builds good epistemic habits related to understanding how research works and so on. The more people in EA have good epistemic habits, the less likely it is that we'll miss something obvious or be vulnerable to bad arguments, scams, etc.