Yeah, there might be no end to how much you can understand about EA (or, more generally, stuff about the world that's relevant to altruism).
I certainly have my own blind spots, but when talking to many other EAs I notice there are a lot of topics they seem unfamiliar with:
- The extent of the uncertainty we have about the philosophy of mind
- Chances of being in a simulation/percentage of copies in a simulation and how that affects the expected value of various actions
- Philosophy of science/core assumptions behind how one thinks the world works
- Views of people in other parts of the world/society
- Relatedly, how futures led by influential people in other countries might compare to futures led by influential people in their own countries
- Reasons to think that quite a lot of things have low tractability
- Noticing how they came to be who they are/their place in history
- Impact of various activities on wild animal suffering
I don't claim to know everything about the above. And of course, others who know more about other things might notice that there are a lot of topics I'm unfamiliar with. Some topics I haven't really thought about much, relative to a lot of people working at EA organizations:
- Anthropics
- Alien civilizations and how they affect priorities in longtermism
- Ethical theories that are super formalized (my moral anti-realism doesn't make me that motivated to look into them)
- Acausal interactions and decision theory
Sorry for my inaccurate wording. It's impossible to understand each subject (such as math, physics...) 100%. But how much time does it take to reach "enough understanding" in every cause area?