It's already been removed from the final selection for various other reasons. Mostly redundancy of the values promoted, which are already covered in other works that will be on the list!

Thank you! And yes, excellent points. 

Haven't personally read Deutsch yet, does he reflect evolutionary thought well?
Both the systems books on the list (explicitly) and Taleb's book (implicitly) describe a lot of the evolutionary process notion. Are there major points lacking?

Thank you! Added Lean Startup on the current list and removed the two books. 

My personal stance on books is that every author can be expected to have their strong points and their epistemic low points. Joy's book was very important to my own understanding, and I personally felt I could easily ignore the minor faults relative to the main argumentation. Different readers have different preferences regarding the distribution of knowledge and error in a book; I'd personally be okay with this.

In consulting this is also referred to as the "Minto pyramid" or "pyramid principle of communication": you put the key message and main insight concisely as the first step.

In a fast-paced environment, this means the one piece with the highest information density is ensured to be delivered before e.g. getting interrupted. It also enables the listener to decide whether this topic is relevant early on in the communication and therefore shows respect for their attention.

After delivering the key message you can elaborate on the underlying insights or what brought you to the conclusion. If further elaboration is necessary, or there is time for it, the receiver of the message can ask further questions.

Here is an image to visualize the principle. The related book is called: "The Pyramid Principle: Logic in Writing and Thinking" by Barbara Minto.

Thank you to the many of you who have filled out the Google form as an alternative to writing a comment!

Here are some great points made:

  • the current average rates EA as a knowledge space at 75% principles/values and only 25% concrete knowledge


  • What We Owe The Future is heavily upvoted by those who already read it
  • Removing Factfulness from the top line of books, as there have been substantial critiques of New Optimism: "people on both sides of the 'is the world getting better' debate can try to make the world better."
  • HPMOR should not be included in the introductory section: "it teaches a certain mindset that fits inside Effective Altruism, but fails to introduce many important parts of the community". The style does not fit the values of a serious movement (the person likes it a lot, though)


  • maybe replace 'The Moral Landscape' by Sam Harris with 'Think' by Simon Blackburn
  • Moral Tribes is extremely digestible for a philosophy book, much more so than Practical Ethics
  • Current ethics list is unfocused: "Parfit is an important ethicist and Reasons and Persons is important. But he isn't really more EA than many other ethicists. On Liberty is not really that EA-relevant. Utilitarianism is good. I like Hedonic Imperative but Pearce's writing style is a turn-off for many, I think. I'd strongly prefer collections of papers/articles than a list of full books. Reading full books is just a terrible strategy for getting a handle on important issues in ethics. If it has to be books, then I would use books that are collections of papers. Particularly: (1) The Oxford Handbook of Population Ethics. And (2) Greaves & Pummer's Effective Altruism: Philosophical Issues. (But I think you could do even better than these collections if you handpicked papers.) To these I would add (3) Mill's Utilitarianism (4) Singer's Expanding Circle. Strongly prefer a small list. Apologies that these thoughts are dashed off and unorganized."


  • redundant information, strongly prefer culling all but maybe one or two.
  • The Scout Mindset was fantastic, and of all the EA books I've read, it's probably the one I'm most inclined to recommend to pretty much everyone I know.
  • Once again, I just think it’s a bad idea to include all these books that are only tangentially related to EA, but are part of niche subcultures with their own worldviews. We’re not going to get a diverse community with fresh ideas if we filter for people who have a similar culture to current EAs.
  • this collection (The Handbook of Rationality) is a more information-dense path to learning this stuff I think. There are likely equally good subs


  • Add Nudge and Make It Stick
  • Freakonomics is not important to read. It's just some fun cases of applied economics. It's not an efficient learning tool and it isn't focused on important issues. It is entertaining. I don't think microeconomics is important because it helps with entrepreneurial decisions; these should maybe be considered separately. Quantitative economics is at least as important as micro. Prefer a culled list. Possibly culled to zero.
  • Black Swan contains some helpful stuff, but it is off-puttingly polemical and not focused on extinction risks. Would make sense in a very large library maybe.
  • Can't really see a justification for including Anarchy, State, and Utopia. I think this list is too long. I don't think EA has a unified or consistent political ideology for short-term nation-states; I think this is a good thing and don't want a list that implies that EA does have one.
  • Add Radical Markets by Posner and Weyl and Nudge


  • The Alignment Problem, while I would say is overall good, does jump around quite a bit narratively. I would want to read other books in this category before recommending it too strongly.
  • "Artificial Intelligence: A Modern Approach" is the only book I'm familiar with on the list that I dislike. The vast majority of people need something more accessible.
  • Animal Liberation was a great "why" book, The End of Animal Farming was a great "how" book. They serve different purposes. Overall, I found Animal Liberation more compelling than The End of Animal Farming, though it is quite a bit denser.

Community and Soft Skills:

  • How to Win Friends and Influence People


  • Daniel Haybron's work on happiness is the best I've come across by far: Happiness and Well-Being: Integrating Research Across the Disciplines; The Pursuit of Unhappiness: The Elusive Psychology of Well-Being (Oxford University Press, 2008); Happiness: A Very Short Introduction (Oxford University Press, 2013).


  • Haven't read Ending Aging, but this seems like a preoccupation of the rationalist community that makes little sense by EA lights? I'd prefer to get rid of this category.


  • Would be surprised if anything beyond what's included in the main EA recommendations is helpful here. Cull!
  • Biographies of changemakers seem like not particularly important reads?

Are there categories missing?

  • BioSecurity/Global Catastrophic Biological risks
  • I think you did an extremely thorough job. Well done!
  • Connections to other philosophies that value EA principles; ask e.g. Buddhist, Christian, and Jewish EA groups for recommendations

Should a category be removed?

  • I think people will assume the importance of an issue is proportional to the number of books included in it on the diagram. As a result, I would remove all object-level categories save the most important, and I would cull within them dramatically. I also think presenting object-level books seems like an endorsement, when mostly they are probably intended as jumping-off points for future thinking.
  • In general, I view MBA-style business strategy books as a negative signal. Of the philosophy books, Parfit is the only positive signal for me. The others mostly scan as popular philosophy.
  • I get a negative impression of people who are really into rationalist books and not much else—convinced of their own superiority, narrow-minded unless the idea is from a trusted rationalist guy, etc.

Anything important missing?

  • Strangers Drowning, by Larissa MacFarquhar

Which books if understood by others would make you more confident in collaborating with them?

  • The Scout Mindset, Human Compatible, The Scout Mindset, anything introductory, Waking Up

I agree with your points made and I tried to explicitly address similar arguments in the original post!

The poster is meant to distill exactly the "canon" or "current orthodoxy" by aggregating what people, currently and historically, have found most important to read. It is explicitly meant as a quick introduction to that meta-knowledge and its content, but also as a representation to build on, including building one's criticism of what is missing. For example, someone could make a co-creational answer poster such as "10 knowledge spheres, and their books, from which EA could gain a lot". More people could then read in those directions, and the next poster in a couple of years could include more of the new books many found crucial. The post also explicitly states that it is not meant to encourage anyone to read all of them.

This project is not centralized at all btw. I'm one member of the community making a draft to ask other community members what they think about it. It's an open conversation.

By having such a representation you can also more quickly tell whether a book is actually a "venturing out" into new territory, or whether it only seems new because one doesn't know how central it already is in the community. I have seen it many times that someone reads a book and feels like they found something completely new and important, while I already knew that many others had read and thought through exactly the same literature.

Think of an "introductory textbook" to a field as an analogy. It's difficult to argue that such books shouldn't exist just because they don't already contain, in depth, all the alternative options and criticisms of the stances they present. The metaphor often used for this problem is that of Wittgenstein's ladder: "My propositions serve as elucidations in the following way: anyone who understands me eventually recognizes them as nonsensical, when he has used them—as steps—to climb beyond them. (He must, so to speak, throw away the ladder after he has climbed up it.) He must transcend these propositions, and then he will see the world aright."

With my current research together with John Vervaeke and Johannes Jaeger, I'm continuing the work on the cognitive science of rationality under uncertainty, bringing together the axiomatic approach (on which Stanovich et al. build) and the ecological approach. 

Here I talk about Rationality and Cognitive Science on the ClearerThinking Podcast. Here is a YouTube conversation between John and me, explaining our work and "The paradigm shift in rationality". Here is the preprint of the same argumentation, "Rationality and Relevance Realization". John also mentions our research multiple times on the Jim Rutt Show.

I've always admired your writings on the topic and you were one of the voices that led me to my current path.

First of all, thank you for speaking up about this. I know very smart people who are scared to just share their perspective on things, and I do think THAT is very dumb.

Secondly, I do think donating some money regularly and cost-effectively is a safe bet, and freaking yourself out about "doing more" or even "the most" can easily be counterproductive. For example, simply doing advocacy and explaining why evidence-based and cost-effective donations are good choices is still neglected in basically every country. There are many such relatively easy tasks that are great leverage points, and in the end it is precisely about comparative advantage. By taking up such tasks, you shoulder burdens that are of relatively lower value for others to carry.

Then, for objectively difficult problems, it is of course reasonable not to try to make them "inclusive". There is a reason for a minimum height requirement to become a soldier: the task environment will not change to accommodate certain people. I understand that you understand this. And by understanding this, and e.g. not attempting something grandiose that ends up harmful, you are counterfactually already winning.

Then I also do think that "higher" intellectual ability and related work do not necessarily produce higher utility. There isn't one best or optimal thing everyone should be doing; the more one reads about complexity and systems science, the clearer this becomes. It also shows that localism (serving one's direct community) is better than it is often portrayed in EA. Creatively and pragmatically solving problems you perceive directly around you is fantastic, and your interest in EA suggests you might be better suited to doing so than others around you.

In general, you can be and become a virtuous person independently of your raw processing powers or academic credentials, and action on all possible levels is needed. 

I noticed that I missed Elephant in the Brain, it's a remarkable book and I agree with including it in the next version.
Hidden Games I had not thought of at all.
Thank you!
