Disclaimer: I'm no expert on any of this (I'm 18 years old). I have the utmost respect for EA and much faith in the movement's ideas; there are just some questions weighing on my mind.
I'm not sure if 'interaction effect' is actually the correct phrase to use but I'll explain what I mean anyway.
When discussing an interaction effect in the context of EA, I'm describing how the actions and career paths with the most expected impact depend on the existence of choices with less estimated impact. From my (fairly surface-level) exploration of EA literature so far, including much of 80,000 Hours, I've gathered that this concept has been overlooked by the community, but I'm very open to being shown otherwise.
I'll illustrate what I'm calling the interaction effect with an example. Let's say someone goes into strategic AI research at the Future of Humanity Institute because this is proposed to be one of the most impactful career paths there is. In aiming for that career, this person relied on the labour of several teachers. When the researcher is sick, they rely on the labour of doctors. They need to eat, and so rely on the labour of people working in supermarkets. They sleep on a bed that could only have been bought through the labour of people working at a bed shop. You get my point. Every aspect of this AI research role is inextricable from other components of society.
My question is: if different roles in society are interdependent, even when seemingly disconnected, then how can they rationally be ranked? How can a set of careers be deemed most important when they literally couldn't exist without (most of) all the other careers out there?
A response I can imagine reading is that even if most jobs are equally important by virtue of their interdependence, the ratio between how needed a type of work is and how many people are actually doing it can serve as a measure of priority. Sure, entertainment is an important form of escapism in hard times and an agreeable source of information, but maybe there are too many artists and not enough people working in, say, biosecurity?
Anyway, hopefully if nothing else this could stimulate some discussion on a topic I've not seen addressed in EA. Cheers
I think the other responses capture the most important response to your question, which is that we tend to look at the value of things on the margin. However, as you're clearly thinking intelligently about important ideas, I thought I'd point you in the direction of some further thinking.
Another, perhaps clearer, case where this "thinking on the margin" happens is charity evaluation. If, for example, there existed some very rare and fatal disease which cost only pennies to cure, it would be extremely cost-effective to donate to an organisation providing cures, until that organisation had enough money to cure everyone with the disease. After that point, the cost-effectiveness of additional funding would drop dramatically. Usually this doesn't happen quite so sharply, but it's still an important effect. It is this sort of reasoning which has prompted GiveWell, for example, to look at "room for more funding", see here.
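To make the rare-disease example concrete, here's a toy sketch of that reasoning. All the numbers (cost per cure, number of people affected) are hypothetical, invented purely for illustration:

```python
# Toy model of "room for more funding": a hypothetical disease that costs
# 5p per cure, with 10,000 people affected. The marginal value of a
# donation is high until the funding gap is filled, then drops to zero.

COST_PER_CURE = 0.05        # hypothetical: pennies per cure (in pounds)
PEOPLE_AFFECTED = 10_000    # hypothetical number of cases
FUNDING_GAP = COST_PER_CURE * PEOPLE_AFFECTED  # £500 cures everyone

def cures_bought(total_donated: float) -> float:
    """Cures produced by a given total donation, capped once everyone is cured."""
    return min(total_donated, FUNDING_GAP) / COST_PER_CURE

def marginal_cures(total_donated: float, extra: float = 1.0) -> float:
    """Extra cures bought by one more pound at the current funding level."""
    return cures_bought(total_donated + extra) - cures_bought(total_donated)

print(marginal_cures(0))    # 20 cures per extra £1 while the gap is open
print(marginal_cures(500))  # 0 — the disease is already fully funded
```

The point is that "how good is donating here?" isn't a fixed property of the cause: the same £1 buys 20 cures at the margin early on and nothing once the gap is closed, which is why evaluators ask about room for more funding rather than total importance.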
There's another way of looking at your question, though, which is to rephrase it as: "how should we assign credit for good outcomes which required multiple actors?"
One approach to answering this version of the question is discussed in depth here. I think you may enjoy it.
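One standard formal tool for this kind of credit-assignment problem (I'm not claiming it's the exact approach in the linked piece) is the Shapley value from game theory: average each actor's marginal contribution over every order in which the actors could have "joined". A minimal sketch, with a deliberately hypothetical example in which a research result needs both the researcher and the teachers who trained them:

```python
from itertools import permutations

def shapley_values(players, value):
    """Shapley value of each player: their marginal contribution to the
    coalition, averaged over all possible joining orders."""
    orders = list(permutations(players))
    credit = {p: 0.0 for p in players}
    for order in orders:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            credit[p] += value(frozenset(coalition)) - before
    return {p: c / len(orders) for p, c in credit.items()}

# Hypothetical: the outcome is worth 100 units, and it happens only if
# BOTH the researcher and the teachers are present; neither suffices alone.
def v(coalition):
    return 100.0 if {"researcher", "teachers"} <= coalition else 0.0

print(shapley_values(["researcher", "teachers"], v))
# Each gets 50.0: when both actors are strictly necessary, the credit
# splits evenly rather than all going to whoever acted "last".
```

This captures your interdependence worry directly: the teachers get real credit even though the researcher is the visible contributor, while still letting us say that the *marginal* value of one more person in an under-supplied role can be much higher than in an over-supplied one.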