
Doing Doing Good Better Better

The general idea: We need more great books!

I've been trying to find people willing and able to write quality books and have had a hard time finding anyone. "Doing Doing Good Better Better" seems like one of the highest-EV projects, and EA Funds (during my tenure) received basically no book proposals, as far as I remember. I'd love to help throw a lot of resources behind an upcoming book project by someone competent who isn't established in the community yet.

The current canonical EA books (DGB, WWOTF, The Precipice) seem pretty good to me. But only a few people have undertaken serious attempts to write excellent EA books.

It generally seems to me that EA book projects have mainly been attempted by people (primarily men) who appear driven by prestige. I'm not sure whether this is good: it means they're going to be especially motivated to do a great job, but prestige-seeking seems uncorrelated with writing skill, so we're probably missing out on a lot of talented writers.

In particular, I think there is still room for more broad, ambitious, canonical EA books, i.e. ones that can be given as a general EA introduction to a broad range of people, rather than a narrow treatment of e.g. niche areas in philosophy. I feel most excited about proposals that have the potential to become a canonical resource for getting talented people interested in rationality and EA.

Perhaps writing a book requires you to put yourself forward in a way that's uncomfortable for most people, leaving only prestige-driven authors actually pursuing book projects? If true, I think this is bad, and I want to encourage people who feel shy about putting themselves out there to attempt it. If you like, you could apply for a grant and partly leave it to the grantmakers to decide if it's a good idea.


A more specific take: Books as culture-building

What's the secret sauce of the EA community? I.e., what are the key ideas, skills, cultural aspects, and other properties of the EA community that have been most crucial for its success so far?

My guess is that much of EA is about building a culture where people care unusually strongly about the truth, and unusually strongly about having an impact.

Current EA introduction books only embody this spirit to a limited degree. They cover a wide range of topics that seem non-central to EA thinking, and they tend to start with a bottom line ("x-risk is 1 in 6 this century" / "longtermism should be a key moral priority") and argue for it, rather than the other way around. (Kind of. I think this is a bit unfair, but directionally true.) They also tend to hide the source of knowledge; they're not especially reasoning-transparent. The LessWrong Sequences do this well, but they're lengthy and don't work for everyone. There's a much wider range of rationality-flavored content that could be written.

If we hand out books for talent outreach purposes, it would be great if EA books could also do the cultural onboarding at the same time, introducing the reader not just to new ideas but also to new reasoning styles. This could reduce community dilution and improve EA culture.

The main concern might be that such a book won't sell well. Maybe; I don't know. HPMOR seems to be one of the most widely read (perhaps the most widely read?) EA and rationality books.


Some specific brainstorming ideas

  • A book about the world's biggest problems, with a chapter for each plausibly important cause area, and some chapters on the philosophy (HT Damon / Fin? or someone else?)
  • A book that discusses a particular cause/issue and uses that to exemplify rationalist/EA-style reasoning, gesturing at the broader potential of the methodology.
  • A book that discusses historical and current deliberate attempts at large-scale-impact projects (e.g., the Green Revolution, the Pugwash Conferences, cage-free campaigns, 1DaySooner, …), covering both successes and failures (with lessons learnt), with plenty of insider war stories and anecdotes that let you understand how the sausage actually gets made.


Epistemic status: Have been thinking about this for about a year, spent 30 minutes writing it down, feel reasonably confident in the general idea but not the specifics. I wrote this quickly and may still make edits.