In EA, there appears to be an interest in "good judgment," sometimes also called "rationality."
There is also interest in forecasting.
My question is, what are the concrete, operationalized differences between skill at forecasting vs having good judgment?
I'm not asking this question facetiously. For example, the organization behind superforecasting research brands itself as the "Good Judgment Project."
But at the same time, when I think about "being good at forecasting" and "having good judgment," many of the same qualities come to mind for both. So how can we cleanly separate the two?
From the post you refer to:
Calibrated estimates for future events are the goal of forecasting, and while model-building and knowledge are valuable for this, I think they're valuable in other ways, too. I think another component of good judgment is being able to judge which problems to work on in the first place and how much effort and resources to put into them, falling under instrumental rationality. You need to decide which problems to apply your forecasting skills to, and I don't think this is a forecasting problem.
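To make "calibrated estimates" concrete: forecasting skill, unlike good judgment more broadly, can be scored numerically. A minimal sketch (my own illustration, not from the post) using the Brier score, the mean squared error between stated probabilities and binary outcomes, where lower is better:

```python
def brier_score(probs, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes."""
    assert len(probs) == len(outcomes) and len(probs) > 0
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# A forecaster saying 0.8 on events that happen 4 times out of 5 scores
# better than an overconfident one saying 1.0 on the same events.
calibrated = brier_score([0.8] * 5, [1, 1, 1, 1, 0])      # 0.16
overconfident = brier_score([1.0] * 5, [1, 1, 1, 1, 0])   # 0.20
print(calibrated, overconfident)
```

The point of the contrast above: "which questions are worth forecasting at all" has no analogous score, which is part of why it falls under judgment rather than forecasting.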
Also, my understanding is that forecasting is specific to predicting possible future events, and would not include having reasonable views on fundamental research questions, e.g. questions about consciousness, physics, or normative ethics.
(I suppose you could try to forecast the answers that experts, or even hypothetical experts, would give to fundamental research questions, but experts can be wrong, and this seems like a pretty unusual application and an ad hoc way to get at such questions.)