Introduction
You rarely have access to all the information that would be useful when making a decision. When I talk about 'unseen data', I mean this kind of missing information.
For now, let's keep the idea of 'data' as broad as possible, and say that it can include facts and scientific studies, but also life experiences, files on a server somewhere, and an understanding of useful analytic processes to apply to a problem, among many other things.
As you have probably guessed, I would like to see EA attribute more value to this kind of unseen data. In particular, I would like to see EAs attribute more value to data they have not seen but that others might have.
I think this would be a good way both to increase EA's chances of identifying problem areas it might have missed, and to generally increase the tractability of priority EA problem areas.
How I am estimating EA's valuation of unseen data
I have a few hypothetical actions in mind that EAs could take (or not take) which would demonstrate a higher or lower valuation of unseen data. Taking more of these actions means a higher valuation, and vice versa:
Promotion of useful and undertaught decision-making tools. We can promote our values and suggest concrete actions, but we can also promote more general tools. For example, an agent who is somewhat aligned with EA, and whose data lets them see a great opportunity others can't, will likely do a better job of noticing and executing on it if they understand basic statistics (see the sketch after this list).
Dedicated funding of speculative bets. I have something like Tyler Cowen's Emergent Ventures in mind. An EA organisation could try to get good at identifying exceptional individuals whose values overlap with EA's and who have ideas for interventions outside of EA's recommended problem areas. Among other things, this would act as a hedge against blind spots in existing problem-area analysis.
Grassroots politics. Support the kinds of groups that Momentum might incubate. Diversity of experience means more unseen data, and activism is a place where otherwise underrepresented people are likely to show up. I suspect this is my most controversial suggestion, since politics is the mind-killer; but done in a careful, evidence-based manner, I think it would no more create a discourse problem than SBF donating to Biden did.
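To make the first item concrete, here is a minimal sketch of the kind of reasoning basic statistics unlocks: a base-rate calculation via Bayes' rule. All the numbers are invented purely for illustration.

```python
# Hypothetical numbers, chosen only to illustrate base-rate reasoning.
p_great = 0.01           # prior: 1% of opportunities like this are genuinely great
p_signal_if_great = 0.9  # great opportunities usually look promising
p_signal_if_not = 0.2    # but so do many mediocre ones

# Bayes' rule: P(great | promising signal)
p_signal = p_signal_if_great * p_great + p_signal_if_not * (1 - p_great)
p_great_given_signal = p_signal_if_great * p_great / p_signal

print(f"P(great | promising signal) = {p_great_given_signal:.3f}")  # ~0.043
```

An agent who has internalised this won't abandon a promising opportunity, but will know to gather more evidence before betting heavily on it.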
How much I would like to see EA value unseen data
I don't know how to identify an optimal amount here, but I do feel confident about specifying an approximate lower bound of action that would move EA's valuation in the right direction.
I would like to see EA study examples of initiatives comparable to the ones above. For example, very good things happened when people in medicine became more scientifically minded: in what ways might this phenomenon generalize to, or be reproducible within, other industries? What can we learn so far from Emergent Ventures' outcomes, from IDinsight's endorsement of the Sunrise Movement, or from the impact of historical protest movements?
This seems like a relatively low-cost bet with a plausible shot at uncovering excellent interventions.
One step up from this would be directly experimenting with low-cost forms of these kinds of interventions. Cambridge University runs an AGI Safety Fundamentals course; what about something comparable on Bayesian statistics for gifted teens?
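As a sketch of what an opening exercise in such a (so far hypothetical) course might look like: updating a uniform prior on an unknown success rate after a handful of trials, using the conjugate Beta-Binomial model. The pilot numbers here are made up.

```python
# Beta(1, 1) is a uniform prior over an unknown success rate.
alpha, beta = 1, 1

successes, failures = 7, 3  # made-up pilot results

# Conjugate update: the posterior is Beta(alpha + successes, beta + failures).
alpha_post, beta_post = alpha + successes, beta + failures

posterior_mean = alpha_post / (alpha_post + beta_post)
print(f"Posterior mean success rate: {posterior_mean:.2f}")  # 0.67
```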
I see almost no interest in these sorts of initiatives at present, which I think represents an undervaluation. (And if I am simply not aware of existing work which fits these criteria, I look forward to learning about it in the comments!)
How this is different from outreach (which EA already does)
I first encountered effective altruism when I met 80K at a careers fair, and I have since received 80K's 1:1 coaching, which is an amazing service. I also recently became aware of Peter McIntyre's https://non-trivial.org/.
These are all good initiatives, but I would also like to see EA trying to harness more people's unseen data before or even without trying to convince them of the EA worldview.
Final Thoughts
This problem has an inverse, overvaluing seen data, but if I end up writing about it, it will be in a separate post. For now, I hope I have encouraged you to reconsider your valuation of unseen data.