I’m Michael Aird, a Staff Researcher at Rethink Priorities, Research Scholar at the Future of Humanity Institute, and guest manager at the Effective Altruism Infrastructure Fund. Opinions expressed are my own. You can give me anonymous feedback at this link.

With Rethink, I'm currently mostly working on nuclear risk research. In the future, I might work on topics related to what I'm calling "Politics, Policy, and Security from a Broad Longtermist Perspective".

Previously, I did longtermist macrostrategy research for Convergence Analysis and then for the Center on Long-Term Risk. More on my background here.

I also post to LessWrong sometimes.

If you think you or I could benefit from us talking, feel free to message me or schedule a call. For people interested in doing EA-related research/writing, testing their fit for that, "getting up to speed" on EA/longtermist topics, or writing for the Forum, I also recommend this post.


Risks from Nuclear Weapons
Improving the EA-aligned research pipeline

Wiki Contributions


Long-range forecasting

Yeah, I think that'd work for this. Or, to avoid a proliferation of tags, we could have forecasting and forecasts, plus just long-range forecasting; if people want to say something contains long-range forecasts, they can use long-range forecasting along with forecasts.

Propose and vote on potential EA Wiki entries

I do see this concept as relevant to various EA issues for the reasons you've described, and I think high-quality content covering "the value of open societies, the meaning of openness, and how to protect and expand open societies" would be valuable. But I can't immediately recall any Forum posts that do cover those topics explicitly. Do you know of posts that would warrant this tag?

If there aren't yet posts that'd warrant this tag, then we have at least the following (not mutually exclusive) options:

  1. This tag could be made later, once there are such posts
  2. You could write a post on those topics yourself
  3. An entry on those topics could be made
    • It's ok to have entries that don't have tagged posts
    • But it might be a bit odd for someone other than Pablo to jump to making an entry on a topic as one of the first pieces of EA writing on that topic?
      • Since wikis are meant to do things more like distilling existing work.
      • But I'm not sure.
      • This is related to the question of to what extent we should avoid "original research" on the EA Wiki, in the way Wikipedia avoids it.
  4. Some other entry/tag could be made to cover similar ground
Long-range forecasting

Should this tag be applied to posts that contain (links to) multiple thoughtful long-range forecasts but don't explicitly discuss long-range forecasting as distinct from forecasting in general? E.g., did it make sense for me to apply it to this post?

(I say "thoughtful" as a rough way of ruling out cases in which someone just includes a few quick numbers merely to try to give a clearer sense of their views, or something.)

I think LessWrong has separate tags for posts about forecasting and posts that contain forecasts. Perhaps we should do the same?

Propose and vote on potential EA Wiki entries

My personal, quick reaction is that that's a fairly separate thing, which could have a separate tag if we feel that's worthwhile. Some posts might get both tags, and some posts might get just one.

But I haven't thought carefully about this.

I also think I'd lean against having an entry for that purpose. It seems insufficiently distinct from the existing tags for career choice or community experiences, or from the intersection of the two.

Propose and vote on potential EA Wiki entries

Actually, having read your post, I now think it does sound more about jobs (or really "roles", but that sounds less clear) than about careers. So I'd now suggest using the term job profiles.

You should write about your job

I think the MVP version you describe sounds good. I'd add that it seems like it'd sometimes/often be useful for people to also write some thoughts on whether and why they'd recommend people pursue such jobs. I think these posts would often be useful even without that, but that could sometimes/often make them more useful.

You should write about your job

Yeah, I definitely expect it'd be worth many people doing this! 

I also tentatively suggested something somewhat similar recently in a shortform. I'll quote that in full:

Are there "a day in the life" / "typical workday" writeups regarding working at EA orgs? Should someone make some (or make more)?

I've had multiple calls with people who are interested in working at EA orgs, but who feel very unsure what that actually involves day to day, and so wanted to know what a typical workday is like for me. This does seem like useful info for people choosing how much to focus on working at EA vs non-EA orgs, as well as which specific types of roles and orgs to focus on. 

Having write-ups on that could be more efficient than people answering similar questions multiple times. And it could make it easier for people to learn about a wider range of "typical workdays", rather than having to extrapolate from whoever they happened to talk to and whatever happened to come to mind for that person at that time.

I think such write-ups are made and shared in some other "sectors". E.g. when I was applying for a job in the UK civil service, I think I recall there being a "typical day" writeup for a range of different types of roles in and branches of the civil service.

So do such write-ups exist for EA orgs? (Maybe some posts in the Working at EA organizations series serve this function?) Should someone make some (or make more)?

One way to make them would be for people thinking about career options to have the calls they would've had anyway, but ask if they can take more detailed conversation notes and then post them to the Forum. (Perhaps anonymising the notes, or synthesising a few conversations into one post, if that seems best.) That might allow these people to quickly provide a handy public service. (See e.g. the surprising-to-me number of upvotes and comments from me just posting these conversation notes I'd made for my own purposes anyway.)

I think ideally these write-ups would be findable from the Working at EA vs Non-EA Orgs tag. 

I think the key difference between my shortform and yours is that your suggestion is broader than just "typical day in the life" or just EA org jobs. I think it's indeed better to suggest something that's broader in those two ways. (I had just had in mind what happened to stand out to me that day after a call with someone.) 

Btw, Jamie Harris noted in a reply to my shortform: 

Animal Advocacy Careers skills profiles are a bit like this for various effective animal advocacy nonprofit roles. You can also just read my notes on the interviews I did (linked within each profile) -- they usually just start with the question "what's a typical day?" https://www.animaladvocacycareers.org/skills-profiles

So those profiles might be of interest to people on the object-level or as examples of what these posts could look like. (Though I don't think anyone should really need to see an example, and I haven't actually read any of those profiles myself.) 

Propose and vote on potential EA Wiki entries

Yeah, this seems worth having! And I appreciate you advocating for people to write these and for us to have a way to collect them, for similar reasons to those given in this earlier shortform of mine.

I think career profiles is a better term for this than job posts, partly because:

  • The latter sounds like it might be job ads or job postings
  • Some of these posts might not really be on "jobs" but rather things like being a semi-professional blogger, doing volunteering, having some formalised unpaid advisory role to some institution, etc.

OTOH, career profiles also sounds somewhat similar to 80k's career reviews. This could be good or bad, depending on whether it's important to distinguish what you have in mind from the career review format. (I don't have a stance on that, as I haven't read your post yet.)

Books and lecture series relevant to AI governance?

Thanks Mauricio!

(Btw, if anyone else is interested in "These histories of institutional disasters and near-disasters", you can find them in footnote 1 of the linked post.)

Books and lecture series relevant to AI governance?

Here are some relevant books from my ranked list of all EA-relevant (audio)books I've read, along with a little bit of commentary on them.

  • The Precipice, by Ord, 2020
    • See here for a list of things I've written that summarise, comment on, or take inspiration from parts of The Precipice.
    • I recommend reading the ebook or physical book rather than the audiobook, because the footnotes contain a lot of good content and aren't included in the audiobook.
    • Superintelligence may have influenced me more, but that's just because I read it very soon after getting into EA, whereas I read The Precipice after already learning a lot. I'd now recommend The Precipice first.
  • Superintelligence, by Bostrom, 2014
  • The Alignment Problem, by Christian, 2020
    • This might be better than Superintelligence and Human-Compatible as an introduction to the topic of AI risk. It also seemed to me to be a surprisingly good introduction to the history of AI, how AI works, etc.
    • But I'm not sure this'll be very useful for people who've already read/listened to a decent amount (e.g., the equivalent of 4 books) about those topics.
    • This is more relevant to technical AI safety than to AI governance (though obviously the former is relevant to the latter anyway).
  • Human-Compatible, by Russell, 2019
  • The Strategy of Conflict, by Schelling, 1960
    • See here for my notes on this book, and here for some more thoughts on this and other nuclear-risk-related books.
    • This is available as an audiobook, but a few Audible reviewers suggest using the physical book due to the book's use of equations and graphs. So I downloaded this free PDF into my iPad's Kindle app.
  • Destined for War, by Allison, 2017
    • See here for some thoughts on this and other nuclear-risk-related books, and here for some thoughts on this and other China-related books.
  • The Better Angels of Our Nature, by Pinker, 2011
    • See here for some thoughts on this and other nuclear-risk-related books.
  • Rationality: From AI to Zombies, by Yudkowsky, 2006-2009
    • I.e., “the sequences”
  • Age of Ambition, by Osnos, 2014
    • See here for some thoughts on this and other China-related books.