
We’ve rated global priorities research (GPR) as one of our top priority areas for some time, but over the last couple of years I’ve come to see it as even more promising.

The field of GPR is about rigorously investigating what the most important global problems are, how we should compare them, and what kinds of interventions best address them. For example, how should we compare the relative importance of tackling global health problems versus existential risks?

It also considers questions such as how much weight to put on longtermism, or whether we should give now or later.

I’d be keen to see more investment in the field, both in absolute terms and relative to the portfolio of effort within the effective altruism community.

Here are some reasons why. Each reason is weak by itself, but taken together they’ve caused me to shift my views.

Positive recent progress

I think the Global Priorities Institute has made good progress, which makes me optimistic about further work.

One form of work is putting existing ideas about global priorities on a firmer intellectual footing, of which I think Hilary Greaves and Will MacAskill’s strong longtermism paper is a great example. This kind of work is useful because it encourages the ideas to be taken seriously within academia, and also helps to uncover new flaws in them.

Another form of work is aimed at directly changing the priorities of the effective altruism community or other altruists. I think Philip Trammell’s work on optimal timing is a success, and might significantly change how we want to allocate resources. Another example is Will MacAskill’s work on whether we’re at the most influential time in history.

Implications of longtermism

I’ve come to better appreciate how little we know about what longtermism implies. Several years ago, it seemed clearer that focusing on reducing existential risks over the coming decades—especially risks posed by AI—was the key priority.

Now, though, we realise there could be a much wider range of potential longtermist priorities, such as patient longtermism, work focused on reducing risk factors rather than risks, or other trajectory changes. There has been almost no research on the pros and cons of each.

There are also many crucial considerations, which could have huge implications for our priorities. I could see myself significantly changing my views if more research was done.

Patient longtermism

I now put greater credence in patient longtermism than I did in the past (due to arguments by Will MacAskill and Phil Trammell, and to my having less credence in very short AI timelines), which makes GPR look more attractive. (And in general GPR seems more robustly good across a variety of forms of longtermism, except for the most urgency-focused forms.)

Relative neglectedness

AI safety has caught on more broadly, while GPR hasn’t. Because of that, the resources invested in AI safety seem to have substantially increased over the last few years, decreasing its neglectedness, while GPR seems to have seen a smaller increase.

Scale of the community

At a lower bound, we can think of GPR as a multiplier on the effectiveness of the rest of the effective altruism community, and so the larger the community, the more valuable the research.

There are now hundreds of millions of dollars spent by the community each year, and thousands of community members doing direct work (probably severalfold more than five years ago), and this research can have real effects on what they do. The research can also be applied beyond the community, so it hopefully has even more potential than this increase would imply.[1]
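As a rough back-of-the-envelope illustration of the multiplier framing (the specific numbers here are hypothetical, not figures from the post): if the community deploys on the order of $200 million per year and GPR improves how that money is allocated by even 1%, that is worth roughly

\[
\$200\text{M/yr} \times 0.01 = \$2\text{M/yr}
\]

in additional impact-equivalent value each year, before counting any influence on resources outside the community.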

Importance of ideas

If anything, I’m even more convinced that the ideas are what matter most about EA, and that there should at least be a branch of EA that’s focused on being an intellectual project. The field of GPR is perhaps our best chance of being this project, and either way, it helps to put EA on a firmer intellectual footing.


Is there anything that has made me less keen on GPR in the last few years? There are a couple of factors, but I think they’re small.

One issue is that it’s still proving hard to attract academic economists into the field, though there has been some progress.

Some have pointed out that there haven’t been paradigm-shifting new arguments in the last couple of years, perhaps suggesting progress is harder than expected, though I think progress has been reasonable or good compared to my expectations.

Another issue is that it’s difficult for most people to contribute to the field directly, and this bottleneck limits how many more people can work on it than do today. Still, there are ways that more people can get involved, and anyone can contribute through donating.

Here’s some more detail on how people can contribute:

How might you contribute?

  • If you’re on track to be an academic researcher, you can seek out relevant topics. Economics and philosophy are the most obviously relevant fields, but there should be useful work in a much wider range of fields. The Global Priorities Institute focuses on the most fundamental questions (see their research agenda), but there is also plenty of applied research to be done in comparing different issues, such as those we list here. There is some more detail on how to pursue the academic path in our priority path write up and profile on academic research.

  • If you know anyone who might be able to do this kind of research (or you have a public platform with this kind of audience), you could consider telling them about the field and why it matters. Many researchers dream of doing work that’s both intellectually fascinating and has major real-world applications.

  • You can take a supporting job at one of the organisations that does this kind of research, such as the Global Priorities Institute or Open Philanthropy. There are not many of these organisations currently, but I expect that the field will grow over the next 10 years, and more centres will be established in a number of universities around the world—so it could be worth bearing in mind that career capital in research management may be relevant down the line.

  • You can see a list of jobs in research and supporting roles here.

  • One way to help with global priorities outside of a formal research setting is to test out a project within an unexplored problem area, to help work out if more people should enter that area in the future.

  • You can donate to the Global Priorities Institute at Oxford. Besides carrying out research, they also have scholarships to support the careers of young researchers at other institutions.

  • GPI recently received a grant from Open Philanthropy, but Open Philanthropy is usually not willing to cover 100% of an organisation’s budget (and that would be a precarious position to be in anyway), so GPI is planning to raise roughly another £1m over the coming months. I expect they could fund more scholarships and positions beyond this, and further diversify their funding base, though with diminishing returns. Note that two of our trustees work at the Global Priorities Institute, so we may be biased.

  • In the future, I hope there will be opportunities to fund new academic centres. This is another potential use of the Global Priorities Institute’s funding, though you could also tell them you’d be interested in doing this when the time comes, and invest your money in a donor-advised fund (DAF) until then.

Crossposted from the 80,000 Hours blog.

Footnotes


  1. Strictly speaking, even if we just focus on the community, the immediate scale of the community is not what’s most relevant. We care more about something like the integral of the scale of the community over its entire future, and research discoveries made today only speed up future discoveries. It’s less obvious that GPR is higher impact based on this analysis, though the current scale of the community is a relevant factor that’s easier to measure. ↩︎
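A minimal formalisation of the point in footnote 1, using my own notation rather than anything from the original post: let $s(t)$ be the scale of the community's direct work at time $t$, and let $m > 1$ be the multiplier a given GPR insight provides once discovered. If the insight would have been found at time $t_1$ anyway, then discovering it at $t_0 < t_1$ adds roughly

\[
\Delta V \approx (m - 1) \int_{t_0}^{t_1} s(t)\, dt,
\]

i.e. the value comes from applying the multiplier over the window by which the discovery is brought forward, not simply from the community's current size.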

Comments (10)

Thank you for sharing more about GPI's priorities and non-Open Phil fundraising goals for this year. Our family will plan to contribute in November or December, after focusing on some other non-profit investments in the next few months.

To borrow a page from political fundraising in the U.S., it could make sense to create formal or informal recognition strategies (along the lines of 80,000 Hours's "Our donors" page) or social opportunities for donors to GPI - whether on the GPI site or on a Medium page a supporter might unofficially maintain if that's easier. Perhaps a fundraiser "Zoom" for $500 or $1,000 a head, where guests could have the chance to meet each other and ask questions of one or more game members of the GPI team? I'd be happy to help organize one of those if helpful.

Also: one suggested edit for the GPI team, in the tiny chance it has an infinitesimal impact on someone's decision re: how much or whether to give to GPI. On the following page, " We are very greatful for any support!" should read "grateful." https://globalprioritiesinstitute.org/supporting-gpi/

Thanks, it's great you're planning to contribute! I've also let GPI know about your feedback.

I just wanted to explicitly add to this post that valuable GPR can, does, and should happen outside of an academic setting. I think this is implied in this post (e.g. the mention of OpenPhil and the link to the GPR roles on the 80k website), but is not quite explicit, so I just wanted to flag it. Researchers outside of academia face a different set of incentives to academics, and can sometimes have more freedom to work on questions that are more practically relevant but less 'publishable' in academic journals. The point is made quite nicely on the 80k GPR page here: https://80000hours.org/problem-profiles/global-priorities-research/#what-are-some-top-career-options-within-this-area

"That said, we expect that other centres will be established over the coming years, and you could also pursue this research in other academic positions.

One downside of academia, however, is that you need to work on topics that are publishable, and these are often not those that are most relevant to real decisions. This means it’s also important to have researchers working elsewhere on more practical questions."

Personally, I think/hope the field of GPR will develop in a similar way to 'impact evaluation' in development economics over the last ~20 years -- i.e. significant progress has been made in academic research (including some of the more important methodological or foundational advances), but there has also been a lot of valuable non-academic impact evaluation research (including lots that is more directly relevant for decision-makers).

Thank you for writing this Ben. I strongly agree with this.

I have also been thinking and writing about this for the past few weeks. And so, in a fit of self-promotion and/or pointing readers to similar work, I direct anyone interested to my post here. I suggest that the EA movement has not done enough in this space, lay out some areas I would like to see researched, make the case that new organisations (or significant growth of existing organisations) are needed, and look at some of the challenges to making that happen.

This was written pretty recently and I still agree with it!

I'm doing a series of recordings of EA Forum posts on my "found in the struce" podcast, also delving into the links and adding my own comments.

I've just done an episode on the present post HERE

I also did one on @weeatquice's post HERE

Let me know your thoughts, and if it's useful. I think you can also engage directly through the Anchor app by leaving a voice response or something.

"There are not many of these organisations currently, but I expect that the field will grow over the next 10 years, and more centres will be established in a number of universities around the world"

What's the basis of this claim?

I think there are donors in the community who will fund this work if we can find people to run these centres (e.g. similar people to those who funded GPI).

I think we can find more people able to run more centres over 10 years. My evidence for this is mainly that we have managed to find people in the past (e.g. the people who work at GPI) and I expect that to continue. I also think GPI is making progress finding people through their seminars and fellowships. Many of these people are junior now, but in 10 years some will be senior enough to found new centres.


Okay, that's great! Thanks :)

This post advocates for greater prioritization of global priorities research, including questions related to longtermism, because such research can increase the impact of the EA community.

The Positive recent progress section implies that research is thought of as traditional philosophical academic journal papers and similar writing, and further suggests that innovative discourse happens predominantly within academia. This thinking could hinder progress within the GPR field.

The Implications of longtermism and Patient longtermism sections can be interpreted as seeking to gain popular attention rather than inviting collaborative engagement from readers. This can discourage readers interested in helping others from engaging with the ideas presented in the piece.

The Relative neglectedness part compares the recent growth of resources dedicated to GPR and to AI safety in the EA community. While this may be true, it does not imply that GPR should be prioritized: what should be considered is the marginal value of additional GPR effort, rather than the comparison of the increase in resources deployed in each field.

The Scale of the community part assumes that EA resources are always perfectly mobile, and thus neglects the possibility of suboptimal institutionalized thinking within EA due to GPR deprioritization at earlier stages, which can also have significant negative implications outside of EA because of the community's leverage.

The Importance of ideas segment notes the limited interest of some academics in GPR, while omitting a broader reflection on recent developments in global re-prioritization.

The How might you contribute? section uses language that connotes the author’s request that readers contribute by 1) identifying GPR topics, 2) asking others to conduct GPR, 3) applying for a junior role at a GPR research organization endorsed by 80,000 Hours, 4) researching the importance of a relatively unexplored issue, and/or 5) donating. Using request language may hurt readers’ prioritization rationality and reduce their long-term engagement with the field.

Thus, this piece makes a sincere appeal but lacks robust arguments to support its thesis.
