This is a crosspost of "The most basic rationality techniques are often neglected" and "Rationality and discipline" by Stefan Schubert, published on 30 August and 15 September 2021 respectively.
The most basic rationality techniques are often neglected and Rationality and discipline
How much can we "debias" ourselves? What should we do to become more rational? When discussing these issues, people usually focus on sophisticated debiasing techniques (e.g. the "pre-mortem") and advanced concepts from epistemology and statistics. What's often forgotten is that we already have a number of very simple but effective techniques for improving our rationality (cf. Grice's maxims), such as:
“Don’t believe things for which you lack evidence. And don’t say such things in discussions.”
“Don’t make irrelevant personal attacks.”
“Don’t attack straw men.”
“Define your terms clearly.”
“Make one point at a time.”
“Try not to be too emotional when thinking about heated issues.”
It seems to me that irrationality regarding moral and political issues (arguably the most important form of irrationality) is very often due to a failure to apply these very simple techniques. That was certainly my experience when I argument-checked opinion pieces and election debates. Most of the fallacies I identified were extremely basic and boring (cf. my post Frequent fallacies). Perhaps the most common was a failure to provide evidence for claims that need it.
So maybe what we need to do to make people more rational isn’t primarily to teach them sophisticated debiasing techniques and advanced concepts. They are costly to learn, and most people have other, more pressing things to attend to. People who suggest new interventions and social reforms often neglect such time and attention costs. One might also suspect that people focus on the more sophisticated rationality techniques partly because they find them more interesting to think about than the basic and boring ones.
Instead, maybe we should focus on getting people to apply the most basic techniques consistently. Some of the sophisticated techniques are no doubt useful, but I'm not sure the primary focus should be on them.
To make people actually use these basic techniques, what's needed is strong social norms: norms saying that you shouldn't believe or say things you lack evidence for, that you should define your terms clearly, and so on. The strength of such norms has varied a lot over the course of history, and still varies today across different contexts. My sense is that people's actual rationality by and large reflects the strength of those rationality norms. These norms can be pushed more or less, and I would guess that they are not yet pushed as much as they realistically could be. Still, it's obviously a difficult task, and I'm unsure how best to go about it.
(This post was first posted on Facebook, 3 February 2020. Slightly revised.)
Rationality and discipline
Rationality has many aspects. It seems to me that the rationalist community often focuses on the fun bits, such as self-improvement, musings on one's own thought processes, and speculative theorising (though no doubt there are important exceptions). What then gets a bit lost is that rationality is to a large extent about discipline, restraint, and rigour: things that aren't necessarily fun for most people. This is perhaps natural given that the community is at least partly built around an intrinsic interest in rationality; it normally doesn't provide strong extrinsic incentives (e.g. degrees, money) to students of rationality. Nevertheless, I think a stronger emphasis on these less intrinsically appealing aspects of rationality is important.
This is an interesting article! I understand the main claim to be that, to become more rational, we should focus on consistently applying basic techniques rather than on learning sophisticated debiasing methods.
An additional claim is that we typically focus on the "fun" parts of rationality, like self-improvement, and neglect the simple but important aspects, such as discipline and restraint, because the latter are less enjoyable to practice.
I assume this extra claim refers to the rationality community or the EA community.
So, the main point is essentially that rationality is mundane and simple (though not easy!), and we shouldn't try to make it more complex than it really is. This perspective is quite refreshing, and I’ve had some similar thoughts!
However, I'm concerned that, even though people might know these techniques, the emotionally charged nature of political and moral topics can make them hard to apply; the problem isn't necessarily a lack of knowledge. Also, while I'm not sure whether you would label these as complex or not, it sometimes takes time to figure out what you actually want in life, and that requires "complex" techniques.
Thanks for the comment, Mikolaj! Your points make sense to me.
There is an interesting connection between those techniques and "Trapped priors", and the broader view of human cognition as Bayesian reasoning, with biases as strong priors. Why would those techniques work (assuming they do)?
I guess some, like "Try to speak the truth", can make you consider a wide range of connected notions. E.g. if you say something like "Climate change is fake", you might start to consider "what would make it true?" Or you just feel (because of your prior) that it's true and ignore any further considerations (in which case the technique doesn't work).
Thanks for sharing that related post, Mikolaj.