📣 Final weekend - applications close Monday 12 Jan at noon (GMT)
🚀 Apply your expertise to advanced AI governance.

Are you a mid-career or senior professional considering a transition into AI governance focused on advanced-AI risk?

Applications are closing imminently for the AI Governance Taskforce Research Associate role in our 12-week Winter 2026 cohort, starting at the end of January.

This role is designed for experienced professionals who want to apply existing domain expertise to frontier AI governance. You will work in small, high-calibre, multidisciplinary research teams producing policy-relevant research, led by Research Team Leaders and guided by expert practitioners based at respected organisations.

Projects:

🔹 Phase identification of AI incident emergence
Sean McGregor (Founder, AI Incident Database)

🔹 Leveraging open source intelligence (OSINT) for loss of control AI risk
Tommy Shaffer Shane (Interim Director of AI Policy, Centre for Long-Term Resilience)

🔹 Stress-testing a loss of control safety case
Henry Papadatos (Executive Director, SaferAI)

🔹 Advancing dangerous capability red lines through the AI Safety Institute network
Su Cizem (Visiting Analyst, The Future Society)

🔹 Developing international AI incident tracking and response infrastructure: the severity threshold – technical triggers for 'unacceptable risk'
Caio Vieira Machado (Senior Associate, The Future Society)

What this involves:
👥 Applied research in small teams
📄 Policy-relevant outputs (papers, briefs, articles)
🌍 Fully remote, global, part-time (8 hrs/week), 12 weeks
💼 Selective, volunteer-based research fellowship
📅 Deadline: Mon 12 January 2026, 12 noon GMT
🔗 More info and apply: https://www.arcadiaimpact.org/ai-governance-taskforce

For queries, please email:
Ben R Smith, Taskforce Lead
ben [at] arcadiaimpact [dot] org

