Your independent impression about something is essentially what you'd believe about that thing if you weren't updating your beliefs in light of peer disagreement - i.e., if you weren't taking into account your knowledge about what other people believe and how trustworthy their judgement seems on this topic. Your independent impression can take into account the reasons those people have for their beliefs (inasmuch as you know those reasons), but not the mere fact that they believe what they believe.
Meanwhile, your all-things-considered belief can (and probably should!) also take into account peer disagreement.
Armed with this concept, I try to stick to the following epistemic/discussion norms, and I think it's good for other people to do so as well:
- I try to keep track of my own independent impressions separately from my all-things-considered beliefs
- I try to feel comfortable reporting my own independent impression, even when I know it differs from the impressions of people with more expertise in a topic
- I try to be clear about whether, in a given moment, I'm reporting my independent impression or my all-things-considered belief
One rationale for that bundle of norms is to avoid information cascades.
In contrast, when I actually make decisions, I try to always make them based on my all-things-considered beliefs.
For example: My independent impression is that it's plausible that an unrecoverable dystopia is more likely than extinction and that we should prioritise such risks more than we currently do. But this opinion seems relatively uncommon among people who've thought a lot about existential risks. That observation pushes my all-things-considered belief somewhat away from my independent impression and towards what most of those people seem to think. And this all-things-considered belief is what guides my research and career decisions. But I think it's still useful for me to keep track of my independent impression and report it sometimes, or else communities I'm part of might end up with overly certain and homogenous beliefs.
This term, this concept, and these suggested norms aren't at all original to me - see in particular Naming beliefs, this comment, and several of the posts tagged Epistemic humility (especially this one). But I wanted a clear, concise description of this specific set of terms and norms so that I could link to it whenever I say I'm reporting my independent impression, ask someone for theirs, or ask someone whether an opinion they've given is their independent impression or their all-things-considered belief.
My thanks to Lukas Finnveden for suggesting I make this a top-level post (it was originally a shortform).
This work is licensed under a Creative Commons Attribution 4.0 International License.
A few arguments for letting your independent impression guide your research and career decisions instead:
IMO, good rules of thumb are:
I agree with your second and third arguments and your two rules of thumb. (I thought about those second and third arguments when posting this and felt tempted to note them, but ultimately decided not to, in order to keep this concise and keep chugging along with my other work. So I'm glad you raised them in your comment.)
I partially disagree with your first argument, for three main reasons:
(I wrote this comment quickly, and this is a big and complex topic where much more could be said. I really don't want readers to round this off as me saying something like "Everyone should just do what 80,000 Hours says without thinking or questioning it".)
Good points.
It's not enough to just track the uncertainty; you also need visibility into current resource allocation. The "defer if there's an incentive to do so" idea helps here, because if there's an incentive, that suggests someone with such visibility thinks there is an under-allocation.
I wonder to what extent maintaining both views simultaneously is practical. Perhaps they would bleed into each other: we might start taking on other people's beliefs as part of our all-things-considered view, then absorb them into our independent impressions without truly questioning or even recognising it.
It sounds quite nice in theory, and I will try it to see to what extent there is bleed-over.
I found the OP helpful and thought it would have been improved by a more detailed discussion of how and why to integrate other people's views. If you update when you shouldn't - e.g. when you think you understand someone's reasons but are confident they're overlooking something - then we get information cascades and groupthink. By contrast, it seems far more sensible to defer to others if you have to make a decision but don't have the time, ability, or resources to get to the bottom of why you disagree. If my doctor tells me to take some medicine for a minor ailment, it doesn't seem worth my even trying to check whether their reasoning was sound.
I like the words inside beliefs and outside beliefs, almost-but-not-quite analogous to inside- and outside-view reasoning. The actual distinction we want to capture is "which beliefs should we report in light of social-epistemological considerations" and "which beliefs should we use to make decisions to change the world".
Agreed that this topic warrants a wiki entry, so I proposed that yesterday just after making this post, and Pablo - our fast-moving wiki maestro - has already made such an entry!
I almost like inside beliefs and outside beliefs, but:
One final point: I think your last sentence could be read as implying that inside views are what we should report, in light of social-epistemological considerations. (Though I'm not sure if you actually meant that.) I think whether it's best to report an independent impression, an all-things-considered belief, or both will vary depending on the context. I'd mostly advocate for being willing to report either (rather than shying away from ever stating independent impressions) and being clear about which one is being reported.
On the social-epistemological point: Yes, it varies by context.
One thing I'd add is that I think it's hard to keep inside/outside (or independent and all-things-considered) beliefs separate for a long time. And your independent beliefs are almost certainly going to be influenced by peer evidence, and vice versa.
I think this means that if you are the kind of person whose main value to the community is sharing your opinions (rather than, say, being a fund manager), you should try to cultivate a habit of mostly attending to gears-level evidence and to some extent ignoring testimonial evidence. This will make your own beliefs less personally useful for making decisions, but will make the opinions you share more valuable to the community.