
Crossposted from Pawel’s blog

This essay has three parts. In part one, I include notes on epistemic status, give a summary of the topic, and, mainly, describe what the Expert trap is.

Part two is about context. In “Why is the Expert trap happening?” I dive deeper into the biases and dynamics behind it. Then in “Expert trap in the wild” I try to point out where it appears in reality.

Part three is about “Ways out”: I list my main ideas for how to counteract the Expert trap. I end with conclusions and a short Q&A.

Intro

Version: 0.3. This is still a very early version of this article. Yes, this thing has versions and will improve with time.

Feedback: I would love to understand what I am missing or learn about any counter-arguments. Feel free to write a comment, dm me, or share anonymous feedback here: sysiak.com/feedback.

Writing style: I don’t normally write, and English is not my native language, but I care about being simple and precise. Read more about my writing approach and values here: sysiak.com/about

Epistemic status: I think it’s a great practice to begin with an epistemic status: that is, state your certainty and effort, link to sources, and point to the main counterarguments. In this case, however, feel free to skip it and come back to it at the end. I am proposing quite a large claim here and writing with quite a lot of uncertainty, so it may be interesting to first evaluate the claims on your own. At the end of the series, I will remind you to come back to the epistemic status, and I will post a short Q&A that will hopefully clarify some gaps.

Epistemic status (optional)

Certainty: If I were to pick one confidence level for the main claim, it would be Likely (70%). There are also parts of the described dynamics that I think are Highly likely (90%) or Almost certain (>97%), for example the explanations of my-side bias, confirmation bias, and hindsight bias, which I think are established and largely non-controversial. But I also have quite a lot of uncertainty: many of the claims sit around Maybe (50%), hovering between 35% and 75%. I think the most uncertain parts are hierarchy bias and the ideas from “The Elephant in the Brain”, which amount to a sweeping reinterpretation of how human motivations work. Personally, I think it’s Likely (~70%) that hierarchy bias largely explains our behavior, but I understand others will find it a lot less probable.

Effort: I have been exposed to these findings for five years now. Since then I have had time to digest them and read thoroughly around the topic.

Evidence: The evidence mostly comes from these sources: “The Elephant in the Brain”, “Stumbling on Happiness”, and Daniel Kahneman’s work. Please let me know if you know of any information that invalidates any of the research mentioned.

Antithesis: What would need to be true to invalidate the main claims of this essay? Here are the three main axes of criticism: 1) I may be missing some major factor (besides hierarchy bias and my-side bias) behind why the Expert trap happens. 2) The findings behind hierarchy bias are not established, so there is a chance they are incorrect or have gaps. 3) All systems lose efficiency as they become more complex; perhaps it is universal and unavoidable that the more complex knowledge gets, the more corrupted and inaccessible it becomes, and the current norms around learning, sharing, and encoding knowledge are actually rather efficient.

Summary

Summary in one paragraph: The main claim of this essay is that knowledge often gets locked in specific areas and levels of expertise. I identify two primary factors contributing to this phenomenon: hierarchy bias and my-side bias. I also present methods to counteract it.

Longer summary: This article explores the intuition that the way our civilization encodes knowledge is often faulty and inefficient, and makes that knowledge difficult to use.

This essay explains my take on a cognitive bias that I call the Expert trap: those who are better informed have trouble passing knowledge on to those who are less informed, or are unable to do so at all. Some call this bias “the curse of knowledge”. I use Expert trap because it’s shorter and more closely tied to the root of the problem. I also see the Expert trap as a larger phenomenon than what one typically associates with “the curse of knowledge”: I see it as driven by a couple of other biases.

  • Hindsight bias – once you know the answer to a question, you will think that you would have guessed it.
  • Hierarchy bias – people’s hidden motive for acquiring knowledge may be less about getting things right than about elevating themselves in a hierarchy.
  • Confirmation bias – once you have formed an opinion, you will tend to select and believe information that strengthens it and discount information that challenges it.

At the root of all these biases is my-side bias – what is mine is better; whichever definition is mine will be questioned less.

I think the Expert trap has large and overlooked consequences. I will propose a hypothesis suggesting that the learning and sharing of knowledge within our civilization is largely inefficient, with the Expert trap serving as the primary explanation. I will describe this using examples of the educational system and our approach to learning in general. Finally, I will explain methods that may be helpful in counteracting it.

What is the Expert trap?

Healthy knowledge

First, let’s define the opposite. What is the healthy state of knowledge – knowledge that is efficient, robust, and useful? The metaphor I like is that of conductivity: high-quality knowledge is highly conductive.

  • It brings one smoothly from not knowing to knowing.
  • It enables one to easily access different levels of complexity, so if one wants to learn just a little, one knows how to do it. A simple explanation should be a good mapping, representation, and stepping stone to a more complex one.
  • It should also be roughly correct at each level of complexity, so if one chooses to stay at a lower level and applies this knowledge to their area of expertise, they will get approximately accurate results.

I think the knowledge created by our civilization is often of low conductivity, and I think one of the main drivers of this is the Expert trap dynamic.

Hindsight bias

The phrase “the curse of knowledge” was first used in 1989 by Camerer and Loewenstein. They saw it as closely related to Hindsight bias – knowing the outcome makes people falsely confident that they would have predicted the answer.

“Study participants could not accurately reconstruct their previous, less knowledgeable states of mind, which directly relates to the curse of knowledge. This poor reconstruction was theorized by Fischhoff to be because the participant was ‘anchored in the hindsightful state of mind created by receipt of knowledge’.” (Fischhoff, Baruch (2003), “Hindsight is not equal to foresight”)

It is as if our brains are wishfully reconstructing the knowledge to fit the outcome. If a person knows the outcome, they may be less inquisitive about its root causes, less motivated to look at it from first principles. They may be looking less carefully at each part of the process and therefore bending inputs so they match the outcome.

Historically, hindsight bias was the first clue to understanding the curse of knowledge (or what I call the Expert trap dynamic). My hypothesis is that it is likely an extension of my-side bias, which others also refer to as motivated reasoning (I write more on this later). That is, we may be motivated to be critical about knowledge only as long as it strengthens our positive self-image. When I know the answer to a question, I am not motivated to really dig into its root causes: I already “correctly guessed” it, and the reward – the feeling that I am smart – has already been delivered.

Tapping experiment

In a 1990 Stanford experiment, subjects asked to finger-tap a popular tune of their choosing were hugely overconfident about how many listeners would recognize it: they estimated that 50% of people would get it, whereas in reality only 1.33% did. This may be the best metaphor for the Expert trap I have stumbled upon. It clearly renders what’s going on in the mind of somebody who “knows”: it looks like a person who “knows” is projecting that knowing onto their audience and is unable to see what they are really communicating.

All experts look the same :)

Knowledge silos

Knowledge is often trapped in different expertise silos, in a state that ranges from hard to use to unusable. I see two main mechanisms here.

First, when people learn, the knowledge gets trapped at each new level of understanding. Once a person has acquired knowledge, they are often unable to explain it to people who don’t yet understand it. What’s fascinating is that it seems they wouldn’t be able to explain it even to a past version of themselves. Somehow the context gets lost. Perhaps, as a person aspires to understand further, they lose track of the “Aha!” moments that brought them to where they are.

Second, knowledge gets trapped across different disciplines. How easily can physicists talk to chemists about the same processes? My guess is that, because of different terminology and mental models, experts in adjacent areas often have a hard time. I will write more extensively on how this may work in “Why is the Expert trap happening?”

Expert trapped in the Morgan Library

But let’s land in the real world. I have a silly example that I think illustrates the dynamic of the Expert trap well.

For two weeks I was living in Manhattan, New York, a ten-minute walk from the Morgan Library. I read that there was an interesting exhibition there, opened Google Maps, and saw a photo of it. It just looked like an old library. I had seen things like this before and decided not to go.

After a while, a friend mentioned the Morgan Library again. “Recommendations from two different sources? The exhibition is still going. Let’s go.” I went there and was blown away. I explored it very carefully, digested it, and sent a photo to my partner. I expected she would respond with excitement, but it seemed to be “meh” to her. I looked at the photo and realized I had taken a photo very similar to the ones on Google Maps.

I think, as in the tapping experiment, I projected my knowing onto the photo. There was more that makes this place fascinating, and the photo didn’t communicate it. Perhaps it is that this private mansion, which inside feels like a rural villa, sits in the middle of Manhattan, one of the most densely populated places in the United States. There is also the juxtaposition of wealth: this was the office of J.P. Morgan, a founder of Chase, the biggest bank in the US. Here is this dirty street, and here, behind a wall, an exhibit of the most insane wealth in the world. A picture on the wall? You get closer: it’s a Hans Memling. Some small altar? You zoom in and it was made a thousand years ago in the Byzantine Empire and frames fragments of the cross of Jesus Christ.

I took a photo of this place and mentally overwrote it with new meanings. We may be victims of the Expert trap on many different layers, and much more often than we assume.

Illusory knowledge

When learning, I think we very often fool ourselves about our level of comprehension. We think we understand something, but what we have actually done is familiarize ourselves with the area and memorize terminology. Memorized names often function as covers that conveniently obscure areas that are still fuzzy. Things start to feel familiar, but we are not much deeper in understanding. This seems like a pretty strong and redefining statement, so to be clear: I see this as a gradient. When a person learns something, they acquire both illusory and true knowledge. I am claiming, however, that the proportion of illusory knowledge is much higher than is conventionally assumed. Also, I don’t think people do this intentionally; most of it happens subconsciously.

So when a person has a breadth of complex terminology connected to some knowledge area, it is easy to mistake it for knowledge that is robust, precise, practical, flexible, and applicable to many contexts. Very few people have the habit of learning things comprehensively – that is, so that when asked to explain something, they can approach it from many different perspectives: explain it to a kid, a high-schooler, or an expert. Learning this way involves taking concepts and applying them to a wide variety of areas, classes, and contexts; testing them against edge cases and counterfactuals; and thinking about them in the most practical way. How does this abstract concept intersect with the real world? If it is true, how will it change what I see in the real world?

Arriving closer to true knowledge seems more like a winding path, or a system of paths traveled in many directions. It may be more the type of thinking that comes from play and curiosity. Illusory knowledge, on the other hand, is often found in thinking that is instrumental – a means to something else – and that tries, in a more straightforward way, to get to conclusions.

I sense this illusory knowledge is abundant. I think it is a primary way we encode knowledge, and our learning methods and educational systems may be full of it. It may be largely present at every level of education, from primary school to higher education.

This could be one of the main reasons why most schools are experienced as boring, and it might also explain why I cannot remember almost any useful knowledge I learned during my primary and high school education.

 

Read Expert trap: Why is it happening? (Part 2 of 3)
 
