|This is a Draft Amnesty Day post. I wrote it in 2020 and ~haven't looked at it since. I'm posting it as-is.|
An invisible project is one of our most important — let’s try to reveal it
This community is about doing the most good. We have many conversations about how to do that. In the course of those conversations, we've slowly pushed forward a cultural understanding of "how do we form correct beliefs?" We call that understanding “epistemics.”
I want to make a few hopefully-useful observations about this general space. If you haven’t read much about epistemics before, I hope it serves as an accessible introduction. If you’re an old hand, I hope it communicates a frame I’ve found useful.
I Why is this hard?
Most of the time, figuring out what's true is easy. When was this bridge built? Look it up on Wikipedia (more on that later). When it's not easy, it's often best to just use the tools someone else has already developed.
The situations where you really start needing to attack the problem are when:
- The tools you're used to are inadequate for the task at hand, or
- You and a collaborator disagree on how to figure out what's true.
Once that happens, I claim most people just get really confused. It's like, what the heck is going on? This makes no sense / my collaborator makes no sense. People give up, or form really deep impasses with those around them. Even if you're fortunate enough to notice the issue for what it is, it can seem really hard to resolve. I think this is what happened when a lot of my friends and I were only half-convinced of AI risk.
How the f**k are you supposed to weigh "we literally have an RCT here" against "this other thing would be big if true"? Many people find the answer obvious, but unfortunately not in the same way. I hope that at some point in your life you’ve viewed it as a hard problem.
Then, just when you and your best friend have figured out how to weigh evidence between yourselves, along comes a whole crowd of other people. Many new complications arise when this is done as a community:
- Not all the participants are able to get complete information, or evaluate all the arguments
- Some participants are probably smarter than others
- Some participants are probably acting adversarially, or with some level of bias favoring their own view
II All project-oriented communities do this
Maybe you’ve heard people talk recently about epistemics, and it’s felt like a fuzzy concept. I hope that presenting the ways a bunch of different communities go about forming beliefs will make the concept more concrete.
As we’ve already mentioned, Wikipedia has some outstanding epistemics. What’s really useful for us here is that they’ve written them down. You can see how they’re tailored to Wikipedia’s particular situation: they need to be legible to outsiders and extremely resilient to adversarial action.
You can observe humanity making a huge leap forward by improving its epistemics in one important domain: science. Our species went from being completely wrong about nearly everything in the natural world to methodically making progress in our understanding. That progress has compounded over time to completely transform the world. It's interesting to note that what those epistemics should be wasn't obvious at first. Are thought experiments valid scientific evidence? And in the soft sciences, the epistemics are still controversial.
Some might claim that science has figured it all out. But what’s the scientific way to predict who you should pick to lead your company? For most decisions humans make, there is simply too little high-quality data.
Perhaps even more so than science, medicine is extremely conservative in its epistemics. There are so many actors involved who would love to take desperate people’s money, so the medical system has drawn a clear bright line between what’s well understood to work and what’s not. But it might be too conservative in quickly changing situations: people from different epistemic cultures were extremely frustrated by health authorities' slowness in promoting mask usage during the pandemic, for example.
News companies are allegedly in the business of helping me learn things that are true. My impression is that they have standards for what sources and origins of information are considered reliable enough for print. They have also historically done their best to differentiate between reporting facts about the world and offering interpretation. For facts I only need to trust that the journalist is honest, whereas for interpretations, I have to trust their judgement.
Social justice activists
You can view one of the central tenets of this culture as an epistemic position. They heavily prioritize personal experience as a strong predictor of whether someone has true beliefs about the topic under discussion.
My impression is that business tends to value truth statements that can be measured. Unlike in science, the metrics do not need to be super-strongly validated. If I had to guess at the reason, it's that this provides an objective coordination mechanism. The metric might be off the mark, but if everyone is pulling in the same direction and the metric is at least somewhat reasonable, they should make progress towards their true goal.
III Why it matters
Thinking about epistemics has made me notice the occasions when I make arguments in support of my side that only work because I’m incorporating evidence I don’t think I should, or excluding evidence I’d otherwise endorse including. It’s hard to do this project well. But having beliefs that converge on the truth is completely critical for the goals we’d like to achieve. Not all of the possible epistemic strategies we could adopt will do equally well at this.
In a world full of crucial considerations, we need all the help we can get not to miss any.