Here’s a version of the database that you can filter and sort however you wish, and here’s a version you can add comments to.
Update: I've been slow to properly update the database, but am collecting additional orgs in this thread for now.
Key points
- I’m addicted to creating collections and have struck once more.
- The titular database includes >130 organizations that are relevant to people working on longtermism- or existential-risk-related issues, along with info on:
- The extent to which they’re focused on longtermism/x-risks
- How involved in the EA community they are
- Whether they’re still active
- Whether they aim to make/influence funding, policy, and/or career decisions
- Whether they produce research
- What causes/topics they focus on
- What countries they’re based in
- How much money they influence per year and how many employees they have[1]
- I aimed for (but likely missed) comprehensive coverage of orgs that are substantially focused on longtermist/x-risk-related issues and are part of the EA community.
- I also included various orgs that are relevant despite being less focused on longtermism/x-risks and/or not being part of the EA community. But one could in theory include at least hundreds of such orgs, whereas I just included a pretty arbitrary subset of the ones I happen to know of.
- I made this relatively quickly, based it partly on memory & guesswork, and see it as a minimum viable product that can be improved on over time. So please:
- If you spot any errors or if you know any relevant info I failed to mention about these orgs, let me know via an EA Forum message or via following this link and then commenting there
- Fill in this quick form if you know of other orgs worth mentioning.
- Let me know if you have questions about how best to use the database or how to interpret parts of it. (I expect many things will turn out to be confusing/unclear, and I’m relying on people to ask questions.)
Here’s a snippet of what the database looks like (from the "view" focused on "Funders/funding-influencers"):
I made this database and wrote this post in a personal capacity, not as a representative of my employers.
How, why, and when to use the database
(This is all how I use the database myself.)
You can filter, sort, and search the database based on the causes/topics and types of work (e.g., grantmaking vs policy advising vs research) you’re interested in.
You can use the database to:
- Generally learn about the landscape of actors in a given area
- Get ideas about what orgs could “provide inputs to you” (funding, advice, feedback, connections)
- Get ideas about what orgs could act as “nodes on your path to impact”, e.g. whose actions could be improved by a research project you’re considering doing or who could translate and transmit your findings to key decision-makers
This could be useful in situations such as when you’re:
- Getting oriented to a new area
- Trying to build career capital in an area
- Generating project ideas, generating theories of change for those project ideas, and prioritising among them
- Conducting a project
- Helping someone else do any of the above things
(For elaboration on points 3 and 4 in the context of research projects specifically, see here, especially Slides 14-15. Those points are more relevant the more you aim to operate like a consultancy or think tank.)
These benefits could occur via:
- The database making you aware of orgs you didn’t know about
- The database making you aware of info you lacked on some orgs, or
- The database “jogging your memory”
- I find it’s easier to notice that an org is worth mentioning to someone I’m advising, or worth considering when making a project plan, if I’m scanning a filtered list of maybe-relevant orgs than if I’m just relying on free recall
Why I made this
Answer 1: As noted, I’m addicted to creating collections.
Answer 2: 18 months ago, I thought EAs should post more summaries and collections, and I still think that, and people seem to often like it when I do that.
Answer 3: 12 months ago, I made a smaller version of this database in hopes that it’d benefit the work of Rethink Priorities’ longtermism team (which I’m a part of) in the ways outlined in the previous section. I feel like it has indeed been useful (though mostly just through guiding my own work and my suggestions to other people; I think other people rarely use it directly). I’ve also ended up fairly often using the database when giving people career or project advice (e.g., to remind myself which orgs I should suggest someone talk to or check out the work of if they’re interested in nuclear risk or forecasting), or sharing snippets of it with people. So I figured I should make a publicly accessible version.
Caveats
Mainly just what I said earlier, but I’ll say it again in bold for good measure:
- I aimed for (but likely missed) comprehensive coverage of orgs that are substantially focused on longtermist/x-risk-related issues and are part of the EA community
- I also included various orgs that are relevant despite being less focused on longtermism/x-risks and/or not being part of the EA community. But one could in theory include at least hundreds of such orgs, whereas I just included a pretty arbitrary subset of the ones I happen to know of.
- I created this fairly quickly and based partly on memory & guesswork
Other caveats:
- A high level of focus on longtermism/x-risks and a high level of involvement in EA are of course neither necessary nor sufficient for an org to be impactful, “good”, wise, etc.
- Obviously I had to make many debatable judgement calls when filling the database in
- These orgs vary massively in their significance and in their relevance to longtermism/x-risks
Possible next steps
- More orgs could be added (using this form)
- This could include drawing on GCRI's 2013 Organization Directory, organizations that have been included on 80k’s job board, and organizations featured in other relevant collections
- Info could be added and corrected (people can leave comments in the Airtable and then I’ll make the appropriate edits)
- Perhaps some other way to structure/display the info would be good?
- Perhaps this should be somehow integrated with other things, like 80k’s job board or my list of EA funding opportunities?
- People could duplicate and then adapt this database in order to make:
- A version that’s relevant to all EA cause areas
- A version that’s relevant to a particular other large EA cause area (e.g., animal welfare)
- A version that “zooms in on” some specific longtermist/x-risk-related area - adding more orgs, individuals, and info relevant to that area and cutting out other things
See also
If this database seems useful to you, you may also be interested in one or more of the following:
- A Database of EA Organizations & Initiatives (contains things relevant to cause areas other than longtermism/x-risks, but otherwise is less comprehensive/detailed)
- List of EA-related organisations (probably superseded by the above, more recent database)
- Job board - 80,000 Hours
- List of EA funding opportunities
- The EA Forum Wiki
- A central directory for open research questions
- Suggestion: EAs should post more summaries and collections
Acknowledgements
I drew on Pablo Stafforini’s and Jamie Gittins’ lists of EA-related orgs. An earlier version of the database benefitted from comments by Janique Behman, David Rhys Bernard, Juan Gil, and perhaps other people who I’m forgetting. The current version of the database and/or this post benefitted from comments from Will Aldred, Aaron Gertler, Jaime Sevilla, Ben Snodin, Pablo Stafforini, and Max Stauffer.
...well, I haven’t actually entered that info, but I’ve made fields for it in hopes of crowdsourcing it from you. ↩︎
Two lists I'm considering making:
Any thoughts?
I'll note:
I think that a lot of people just really can't understand or predict what would be useful without working in an EA org or in an EA group/hub. It took me a while! The obvious advice for people who want to really kickstart things is to first try to work in or right next to an EA org for a year or so; then you'll have a much better sense.
Just throwing out a thought: if many EA orgs have software needs and are struggling to employ people who'll solve them, and on the other hand part-time employees or volunteer directories don't help that much - would it make sense to start a SaaS org aimed at helping EA orgs?
I could see a space for software consultancies that work with EA orgs, that basically help build and maintain software for them.
I'm not sure what you mean by SaaS in this case. If you only have 2-10 clients, it's sort of weird to have a standard SaaS business model. I was imagining more of the regular consultancy payment structure.
EA Software Consultancy: In case you don't know these posts:
Part 1
Part 2
Part 3
Yea, I was briefly familiar.
I think it's still tough, and agree with Ben's comment here.
https://forum.effectivealtruism.org/posts/kQ2kwpSkTwekyypKu/part-1-ea-tech-work-is-inefficiently-allocated-and-bad-for?commentId=ypo3SzDMPGkhF3GfP
But I think consultancy engineers could be a fit for maybe ~20-40% of EA software talent.
Working at an EA org to discover needs: This seems much slower than asking people who work there, no? (I am not trying to guess the needs myself)
It really depends on how sophisticated the work is and how tied it is to existing systems.
For example, if you wanted to build tooling that would be useful to Google, it would probably be easiest just to start a job at Google, where you can see everything and get used to the codebases, than to try to become a consultant for Google, where you'd ask for very narrow tasks that don't require you to be part of their confidential workflows and similar.
I agree I won't get everything.
Still, I don't think Google is a good example. It is full of developers who have a culture of automating things and even free time every week for side projects. This is really extreme.
A better example would be an organization that has 0 developers. If you ask someone in such an organization whether there's anything they want to automate, some repetitive task they're doing a lot, or an idea for an app (which is probably terrible but will indicate an underlying need) - things come up.
But also, I tried, and I think 0 such needs surfaced
:)
Both sound to me probably at least somewhat useful! I'm ~agnostic on how likely they are to be very useful, how they compare to other things you could spend your time on, or how best to do them, which is mostly because I haven't thought much about software development.
I expect some other people in the community (e.g., Ozzie Gooen, Nuno Sempere, JP Addison) would have more thoughts on that. But it might make sense to just spend like 0.5-4 hours on MVPs before asking anyone else, if you already have a clear enough idea in your head.
I can also imagine that a Slack workspace (or a channel in an existing workspace) for people in EA who are doing software development, or are interested in it, could perhaps be useful.
(Sidenote: You may also be interested in posts tagged software engineering and/or looking into their authors/commenters.)
Great work Michael, I've already included this Airtable in the curriculum of Training For Good's upcoming impactful policy careers workshop. Well done, this work is of high value!
Glad to hear that you think this'll be helpful!
(Btw, your comment also made me realise I should add Training For Good to the database, so I've now done so.)
Also note that there are EA Forum Wiki entries for many of the orgs in this database, which will in some cases be worth checking out either for the text of the entry itself, for the links in the Bibliography section, or for the tagged posts.
Cool that you made this, and that you even made a Softr page! Although I think the Softr page is worse than just sharing a public grid view of the Airtable.
I realize it would be cool to have a similar database for all EA-related organisations. Jamie Gittins made one on Notion and has a Forum post here listing EA orgs, but neither is easily filterable. It could have similar attributes to the Airtable you have. I saw that Taymon also has a Google Sheet, but it would be nice to have it in an Airtable with more attributes, to make it more easily filterable and more colorful.
Can you share a public grid view of the Airtable in a way that allows people to filter and/or sort however they want but then doesn't make that the filtering/sorting that everyone else sees? I wasn't aware of how to do that, which is the sole reason I added the Softr option. I think the set of Airtable views I also link people to is probably indeed better if people are happy with the views (i.e., combos of filters and orders) that I've already set up.
Agreed that an all-of-EA version of this would also be useful, and that Airtable would be better for that than Notion, a Forum post, or a Google Sheet. I also expect it's something that literally anyone reading this could set up in less than a day, by:
You can share this link instead, which is better than the Softr view, and this means people don't need to get comment access to be able to view the Airtable grid. It also prevents people from being able to see each other's emails if they check the base collaborators. To find that link, I just pressed "Share" at the top right of the base, and scrolled down to the bottom of that modal/pop-up to find the link.
Ah, nice, thanks for that! It seems that that indeed allows for changing both "Filtered by" and "Sorted by", including from each of my pre-set views, without that changing things for other people, so that's perfect!
I still want to provide the comment access version as well, so people can more easily make suggestions on specific entries. But I'll edit my post to swap the softr link for the link you suggested and to make the comment access link less prominent.
No problem!
I suggested as one possible next step "People could duplicate and then adapt this database in order to make [a] version that’s relevant to all EA cause areas"
I think such a database has now been made! (Though I'm not sure if that was done by duplicating & adapting my one.) Specifically, Michel Justen has made A Database of EA Organizations & Initiatives. I imagine this'd be useful to some people who find their way to this post.*
Here's the summary section of their post, for convenience:
*I guess I should flag that I haven't looked closely at Michel's post or database, so can't personally vouch for its accuracy, comprehensiveness, etc.
I just wanted to leave a note saying that I found this database useful in my work.
Some orgs that should maybe be added (I'd be keen for someone to fill in the form to add them, including relevant info on them):
Epoch
Labour for the Long Term
EffiSciences
Palisade Research
"At Palisade, our mission is to help humanity find the safest possible routes to powerful AI systems aligned with human values. Our current approach is to research offensive AI capabilities to better understand and communicate the threats posed by agentic AI systems."
Jeffrey Ladish is the Executive Director.
Admond
"Admond is an independent Danish think tank that works to promote the safe and beneficial development of artificial intelligence."
"Artificial intelligence is going to change Denmark. Our mission is to ensure that this change happens safely and for the benefit of our democracy."
Senter for Langsiktig Politikk
"A politically independent organisation aimed at creating a better and safer future"
A think tank based in Norway.
Confido Institute
Epistea
Transformative Futures Institute
Led by Ross Gruetzemacher
SaferAI
Also AFTER (Action Fund for Technology and Emerging Risk)
Orthogonal: A new agent foundations alignment organization
Also Future Academy (but maybe that's not an org and instead a project of EA Sweden?).
Also anything in Alignment Org Cheat Sheet that's not in here. And maybe adding that post's 1-sentence descriptions to the info this database has on each org listed in that post.
Also fp21 and maybe Humanity Forward.
(Reminder: This is a database of orgs relevant to longtermist/x-risk work, and includes some orgs that are not part of the longtermist/x-risk-reduction community, don't associate with those labels, and/or don't focus specifically on those issues.)
Also Alvea and Nucleic Acid Observatory
Also Apollo Fellowship, Atlas Fellowship, Condor Camp, Pathfinder, and Successif. Also Apollo Academic Surveys
Also AI Safety Field Building Hub and Center for AI Safety
Also Space Futures Initiative and Center for Space Governance
Apart Research
Also the European Network for AI Safety (ENAIS)
Riesgos Catastróficos Globales
International Center for Future Generations
As of today, their website lists their priorities as:
Also EA Engineers
Harvard AI Safety Team (HAIST), MIT AI Alignment (MAIA), and Cambridge Boston Alignment Initiative (CBAI)
These are three distinct but somewhat overlapping field-building initiatives. More info at Update on Harvard AI Safety Team and MIT AI Alignment and at the things that post links to.
Policy Foundry
The Collective Intelligence Project
Also Fund for Alignment Research
Also Cavendish Labs:
Also Institute for Progress
Also Encultured AI
Also Pour Demain
Also the Forecasting Research Institute
Also School of Thinking
To the best of my knowledge, Samotsvety is a group of forecasters, not an organization (although some of its members have recently launched or will soon launch forecasting-related orgs).
Also Research in Effective Altruism and Political Science (REAPS)
Times I have used this post in the course of my research: II.
Is that 11 or 2?
(Either way, thanks for letting me know :) )
2. Cheers.
See also Description of some organizations in or adjacent to long-term AI governance (non-exhaustive) (2021) (linked to from https://forum.effectivealtruism.org/posts/68ANc8KhEn6sbQ3P9/ai-governance-fundamentals-curriculum-and-application ).
How do I submit notes / corrections on orgs in the table?
"If you spot any errors or if you know any relevant info I failed to mention about these orgs, let me know via an EA Forum message or via following this link and then commenting there."
(The very first link I provide in this post allows changing the filtering & sorting, but not commenting, so you have to instead either send a message or use that other link.)
Thanks for your interest in suggesting extra info / corrections :)