In the 80,000 Hours podcast episode with Ezra Klein (https://80000hours.org/podcast/episodes/ezra-klein-journalism-most-important-topics/#biggest-critiques-of-the-effective-altruism-and-rationalist-communities-012040), one of Ezra's critiques of the rationalist/EA community is that its members too often state their epistemic status or confidence levels when making claims. He said: "I appreciate that Scott Alexander and others will sometimes put ‘epistemic status, 60%’ on the top of 5,000 words of super aggressive argumentation, but is the effect of that epistemic status to make people like, “Oh, I should be careful with this,” or is it like, “This person is super rational and self-critical, and actually now I believe him totally”? And I’m not picking on Scott here. A lot of people do this. [...] And so people are just pulling like 20%, 30%, 70% probabilities out of thin air. That makes things sound more convincing, but I always think it’s at the danger of making people… Of actually it having the reverse effect that it should. Sometimes the language of probability reads to folks like well, you can really trust this person. And so instead of being skeptical, you’re less skeptical. So those are just pitfalls that I notice and it’s worth watching out for, as I have to do myself."
I agree that merely mentioning your confidence level (e.g. "I feel X% confident about Y") can be misleading and not very informative. But it got me thinking: instead of communicating confidence levels, it might be more fruitful for people to communicate their epistemic status shifts (e.g. "after thinking and reading about Y recently, I changed my confidence from X% to Z%") or confidence interval changes (e.g. "with this new evidence, I narrowed down my confidence interval from..."). This gives clearer information about how people update their beliefs, how strong or important they consider the new evidence to be (i.e. what Bayesian update factor they apply), and what their prior beliefs or confidence levels were. It also helps create a culture where changing one's mind is considered a good thing, and where changing one's mind does not always mean making 180° turns, but also includes having smaller shifts in epistemic status.
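To make the "Bayesian update factor" concrete: a stated shift like "I moved from 50% to 75% on Y" implies a specific likelihood ratio, which is exactly the extra information a bare confidence level hides. A minimal sketch of the odds-form arithmetic (function names are my own, purely illustrative):

```python
def bayes_update(prior_prob: float, bayes_factor: float) -> float:
    """Apply a Bayes factor (likelihood ratio) to a probability, in odds form."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * bayes_factor
    return posterior_odds / (1 + posterior_odds)

def implied_bayes_factor(prior_prob: float, posterior_prob: float) -> float:
    """How strong the evidence was, as implied by a reported confidence shift."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = posterior_prob / (1 - posterior_prob)
    return posterior_odds / prior_odds

# A shift from 50% to 75% means the evidence carried a Bayes factor of 3
# (prior odds 1:1 times 3 gives posterior odds 3:1, i.e. 75%).
print(implied_bayes_factor(0.5, 0.75))  # 3.0
print(bayes_update(0.5, 3))             # 0.75
```

So a reported shift conveys both the prior and the evidential weight, whereas "I'm 75% confident" alone conveys neither.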