Metaculus currently has over 1,000 open forecasting questions, many of which are longtermist or EA-focused.
These include several EA-focused categories, e.g. the EA Survey 2025 series, the Alt-Protein Tournament, Animal Welfare, the "Ragnarok" global catastrophic risks series, and other questions on the distant future.
I am volunteering at Rethink Priorities doing forecasting research, and I am looking for EA-related questions with long time horizons (>5 years) that people are interested in seeing predictions on. If there are, I am willing to put some time into operationalising them and submitting them to Metaculus.
I think this would be directly useful both for those who have these questions and for others who find them interesting, and it would also expand the database of such questions we have for the purpose of improving long-term forecasting.

This post is part of a project of Rethink Priorities.
It was written by Charles Dillon, a volunteer for Rethink Priorities. Thanks to Linch Zhang for advising on this post. If you like our work, please consider subscribing to our newsletter. You can see all our work to date here.
I'd also love for someone to turn a bunch of questions from my draft Politics, Policy, and Security from a Broad Longtermist Perspective: A Preliminary Research Agenda into forecasting questions; many of them would most naturally have horizons of >5 years.
This comment is again asking you to do most of the work: picking out which questions in that agenda are about the future and then operationalising them into crisp forecasting questions. But I'll add, as replies, a sample of questions from the agenda that I think it would be cool to operationalise and put on Metaculus.
On authoritarianism and/or dystopias
- What are the main pathways by which each type of authoritarian political system could reduce (or increase) the expected value of the long-term future?
  - E.g., increasing the rate or severity of armed conflict; reducing the chance that humanity has (something approximating) a successful long reflection; increasing the chances of an unrecoverable dystopia.
- Risk and security factors for (global, stable) authoritarianism
  - How much would each of the “risk factors for stable totalitarianism” reviewed by Caplan (2008) increase the ri…