Why are people so bad at reasoning? For the same reason they’re so bad at letting poisonous spiders walk all over their face without freaking out. Both “skills” are really bad ideas, most of the people who tried them died in the process, so evolution removed those genes from the population, and successful cultures stigmatized them enough to give people an internalized fear of even trying.
I'm really glad that I read this, and to be honest, a little disturbed by it. I was left with the sense that it contains important knowledge that is undervalued and that I hadn't previously been exposed to.
Summary of why this is a worthwhile read for people interested in EA:
- EA involves, or at least seems intertwined with, heavy use of rationality to identify the most important problems and the best ways to solve them.
- This post by Scott Alexander presents a compelling case for some of the downsides of rationality.
- It also presents a case for cultural evolution being a key force, perhaps the key force, for human progress.
- In the pursuit of doing as much good as possible with the assistance of rationality, it seems useful for EAs and the EA community to understand both the historical challenges with rationality and the importance of cultural evolution to human progress over the long-term future.
For what it's worth, I still have lots of open questions. But it seems like this book, and the review, both contain potentially important and under-discussed ideas.
Thanks for posting this. Posts that introduce books or other bodies of work not explicitly about EA or an EA cause area, but that explain relevant ideas from disparate disciplines, seem valuable, and I would like to see more of them.
See also this follow-up for extended quotes:
For a different take on the consequences of being "rational", I would highly recommend James C. Scott's book Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. SSC's summary of the book is pretty good, but when he gives his opinion on the book, he seems to have missed its point entirely.
What do you think is the point of the book that SSC missed?
It is most apparent in this piece of the review:
He remains focused on expected crops per acre, even though every case study in the book illustrates that no single variable like that can encompass the multitude of uses the acre in question has. I don't think I could describe it better than Reddit user u/TheHiveMindSpeaketh does:
I personally think this is an important question for EAs to grapple with: can we reason abstractly about doing good without that abstraction causing mistakes at the level of what to value? Scott's technocrats surely did not think they were making that mistake, but they were. If we believe that we are somehow different, that is kind of arrogant.