There is a particular irony in watching a field dedicated to reducing existential risk quietly reproduce the same patterns of exclusion it claims to be solving.

AI safety and governance funding is about to grow dramatically. Between new philanthropic vehicles, the Anthropic and OpenAI tender processes and shortening timelines to transformative AI, more money is entering this space than ever before. That is genuinely good news. But if we are not intentional about how it moves, this flood of funding risks making existing blind spots bigger rather than fixing them.

The question is not whether the money will arrive. It will. The question is whether the right systems exist to direct it well — and right now, they are not yet ready for what is coming.

The default will be to fund the already funded

When money moves fast, it follows familiar paths. It flows toward organisations with established reputations, researchers with institutional homes and people who already have access to the right rooms. This is not deliberate exclusion. It is simply how capital behaves when there are no intentional systems in place to direct it otherwise.

The challenge is that those familiar paths in AI governance currently run through a relatively small number of institutions concentrated in a handful of cities. Meanwhile the technology being governed will touch billions of people living far outside those cities — in Lagos, Nairobi, Jakarta, São Paulo and countless other places where AI systems are already reshaping daily life, often without adequate frameworks to manage the consequences.

There is a growing group of practitioners in these regions who are building AI governance capacity without institutional backing, without salaries and without access to the networks that open funding doors. They are drafting governance frameworks, running AI literacy programmes, contributing to global policy conversations and publishing work that genuinely adds to the field.

They are doing the work. They are just not getting funded for it.

Why current funding systems miss them

EA and longtermist funding was built around a specific type of applicant. Someone with a publication record. Someone attached to a recognised institution. Someone whose credibility is already legible within the existing ecosystem.

This design is not intentionally exclusive — but it consistently advantages people who had access to the institutions that produce those credentials. The result is a funding landscape where the real question being asked is not always "is this person doing valuable work?" but rather "can we verify this person through channels we already trust?" These are not the same question. Treating them as if they are is how capable, committed practitioners get filtered out — not because their contributions lack value but because that value has not yet appeared in the formats the system is built to recognise.

The coming funding torrent will not fix this on its own. Without deliberate redesign, it will simply amplify the existing pattern at greater scale.

What better systems would look like

This moment is a genuine opportunity to rebuild rather than simply expand. Here is what more effective infrastructure could look like in practice.

First — evaluate the work, not just the credentials. A governance document drafted for an ethics organisation, an essay published on the EA Forum, a webinar series reaching practitioners in underserved regions — these are real, assessable outputs. The infrastructure to evaluate them directly just needs to exist and be consistently used rather than defaulting to institutional proxies.

Second— treat geographic diversity as strategy, not goodwill. AI systems will operate globally. The frameworks that govern them will be stronger — not just more equitable, but genuinely more robust — if they are shaped by people who understand different regulatory environments, cultural contexts and real-world failure modes. A governance framework stress-tested only against the assumptions of one region is a framework with blind spots. Funding diversity is not an act of charity. It is epistemically sound investment.

Third — use community signals as verification. The EA community has already built strong tools for evaluating ideas — forum contributions, consistent engagement, peer recognition over time. These same signals can serve as legitimate verification for practitioners who lack formal institutional affiliations. Someone who has been contributing thoughtfully and consistently to governance conversations, producing public work and demonstrating depth of knowledge over time has a verifiable track record. It simply requires a different — and broader — definition of what a track record looks like.

The specific risk of a funding flood

Large, fast-moving capital has a known failure mode. It generates activity that looks like progress without necessarily producing it. When money becomes abundant, the incentive quietly shifts — from doing the most valuable work to being visible to the people writing the cheques. Organisations optimise for fundability. Individuals optimise for recognition in the right channels. The field fills with well-resourced projects that reinforce what already exists rather than building what is genuinely missing.

The solution is not less funding. It is smarter infrastructure for directing it — systems that can find and evaluate talent wherever it exists, move quickly enough to support emerging practitioners before they burn out or disengage, and actively resist the gravitational pull toward the already-established.

The people who will help solve the governance challenges of advanced AI are not all in the places we expect to find them. Some of them are already doing the work — quietly, without funding, without institutional cover — simply because they believe it matters.

The torrent is coming. The question is whether it will reach them.

This essay was developed with AI assistance (Claude, Anthropic).
