I’m Michael Aird, a Senior Research Manager at Rethink Priorities, Research Scholar at the Future of Humanity Institute, and guest fund manager at the Effective Altruism Infrastructure Fund. Opinions expressed are my own. You can give me anonymous feedback at this link.
At Rethink Priorities, I'm currently working mostly on nuclear risk research and AI governance & strategy research.
I also post to LessWrong sometimes.
If you think either of us could benefit from talking, feel free to message me or schedule a call. For people interested in doing EA-related research/writing, testing their fit for that, "getting up to speed" on EA/longtermist topics, or writing for the Forum, I also recommend this post.