MichaelA🔸

Nice post! If anyone wants AI governance/safety recommendations, feel free to message me. There's a set of orgs that I'm confident (a) are better suited to funding from non-OP donors than from OP, and (b) are considered good by people like me and by OP grantmakers. These are the orgs I give to myself. It's up to each person whether they want my suggestions, of course!

(I was previously a grantmaker for EA Funds, and have been in the AI governance space for a few years.)

Hi Richard, quick reactions without having much context:

  • If you mean this is all one company, that sounds like putting too many eggs in one basket and not exploring enough. 
    • I think it's generally good to apply to many different types of roles and organizations
    • Sometimes it makes sense to focus mostly on one role type or one org. But probably not entirely, and not once one has already gotten some evidence that it's not the right fit. (Receiving a few rejections isn't much negative info, but if it's >5 for one particular org or type of thing, that's probably at least enough evidence that one should also apply to lots of other things and not spend much further time on this one thing.)
  • I'd be much less focused on "Am I annoying them?" than on "Am I spending too much of my valuable time on this one type of thing, and potentially missing lots of other better-fitting things elsewhere?"

The AI Safety Fundamentals opportunities board, filtered for "funding" as the opportunity type, is probably also useful. 

Oh wow, thanks for flagging that, fixed! Amazing that a whole extra word in the title itself survived a whole year, and survived me copy-pasting the title in various other places too 😬

AI Safety Support has a list of funding opportunities. I'm pretty sure all of them are already in this post + comments section, but it's plausible that'll change in future. 

Yeah, the "About sharing information from this report" section attempts to explain this. Also, for what it's worth, I approved all access requests, generally within 24 hours.

That said, FYI I've now switched to the folder being viewable by anyone with the link, rather than requiring requesting access, though we still have the policies in "About sharing information from this report". (This switch was partly because my sense of the risks vs benefits has changed, and partly because we apparently hit the max number of people who can be individually shared on a folder.)

AI Safety Impact Markets

Description provided to me by one of the organizers: 

This is a public platform for AI safety projects where funders can find you. You shop around for donations from donors that already have a high donor score on the platform, and their donations will signal-boost your project so that more donors and funders will see it. 

See also An Overview of the AI Safety Funding Situation for indications of some additional non-EA funding opportunities relevant to AI safety (e.g. for people doing PhDs or further academic work). 
