
This post is part of a sequence on Meta Coordination Forum 2023. It summarizes pre-event survey respondents’ brainstorming on projects they’d like to see. 

You can read more about the pre-event survey results, the survey respondents, and the event here.

About this survey section

We solicited project proposals from Meta Coordination Forum (MCF) 2023 attendees by asking a few closely related, optional questions. These included: 

  • What new projects would you like to see in EA?
  • What obviously important things aren’t getting done?
  • What projects should have existed a year ago?
  • What’s a “public good for EA” that nobody has a direct incentive to do but that would benefit lots of people?

The resulting list is not a definitive list of the best meta-EA projects; it’s more like a brainstorm and less like a systematic evaluation of options.

  • Respondents filled in their answers quickly and may not endorse them on reflection.
  • Respondents probably disagree with each other. We never asked respondents to evaluate each other’s suggestions, but we’re pretty sure that doing so would have revealed big disagreements. (There was significant disagreement on most other survey questions!)
  • The value of these projects depends on how well they are executed and who owns them.

If someone is interested in taking on one of these projects and would like to connect with the person who proposed it, please reach out. We may be able to put you in touch.

Project Proposals

Coordination and Communication

  1. Projects focused on improving connections to groups outside EA (e.g., government, companies, foundations, and media).
  2. A common knowledge spreadsheet of directly responsible individuals for important projects.
  3. More “public good”-type resources on the state of different talent pipelines and important metrics (e.g., interest in EA).  
  4. More coherent and transparent communication about the funding situation/bar and priorities.
  5. More effort going into identifying and making known low-integrity actors through some transparent mechanism.
  6. More effort into improving boards and reducing conflicts of interest across organizations / boards.
  7. More risk management capacity for EA broadly as a field and not just individual orgs. 

Career Advice and Talent Allocation

  1. Advanced 80K: Career advice targeted at highly committed and talented individuals.
  2. A separate organization that is an 80K analogue for mid-career people.

Community Engagement

  1. More creative and fun ways for young people to learn about EA principles that don’t place as much emphasis on doing "the single most important thing".
  2. More support and appreciation for people doing effective giving work (GWWC, Longview), and encouragement for others to do more of this. 
  3. A survey to identify why high-value potential members "bounce off" EA.
  4. A better way to platform great community members who can promote virtues and key principles.

AI Safety

  1. AI Safety Next Steps: A guide to facilitate entry into AI safety research and activism.
  2. Something to help people understand and evaluate the actions of AI labs, and possibly critique them. 
  3. An org that can hire and lightly manage independent researchers.
  4. A better understanding of the relevance of UK or EU AI policy on x-risk, and comparison to US policy.
  5. A really good book on AI risk.
  6. AGISF (AGI Safety Fundamentals) in workshop form.
  7. More AIS grantmaking. 
  8. A public policy institution straightforwardly advocating the case that AI poses an existential risk.

Evaluation and Accountability

  1. More charity evaluators.
  2. More measurement and evaluation/accountability of meta projects.
  3. A public EA impact investing evaluator.

Fundraising and Donor Engagement

  1. More work on donor cultivation and fundraising.
  2. A new grantmaker with various beneficial attributes like speed, judgment ability, and infrastructure.
  3. More community building for effective giving. 

Education and Training

  1. Systematic educational/training materials and community building in areas outside AIS.
  2. Leadership fast-track program.

Media and Outreach

  1. A podcast to keep people updated on EA-related developments.
  2. A bunch of media platforms for sharing EA ideas (YouTube, podcast, Twitter, etc.).
  3. An analog of Non-trivial but for university students.
  4. Better on-ramps to the most impactful career paths.

Diversity and Inclusion

  1. An organization that specializes in improving ethnic, racial, and socioeconomic diversity within EA.

Other Initiatives

  1. A high-quality longtermist incubator.
  2. EAG-like cause-specific conferences.
  3. Fastgrants and other quick funding mechanisms.
  4. A post-FTX investigative unit.
  5. An awards program to create more appreciation within the community.
  6. More badass, obvious global health and development (GHD) wins like Wave.
  7. An initiative that helps people prepare for crunch time and crises.
  8. More applied cause-prioritization work outside of Open Philanthropy. 
  9. More critiques of views closely associated with Open Philanthropy funding. 
  10. Cause-specific community-building organizations, analogous to what CEA does for EA.

(Reversed) What is a project or norm that you don’t want to see?

  • Incubators: One respondent stated that incubators are "super hard and over-done," mentioning that they are too meta and often started by people without entrepreneurial experience.
  • Making Donor Participation Onerous: One respondent is concerned that setting high standards for donors could make it difficult for new donors to contribute to EA, possibly causing the community to shrink.
  • Community Building and Early Funnel Bottlenecks: One respondent expressed the opinion that non-targeted community building may be overrated and that there may not be much of a bottleneck in the early stages of community funneling except for exceptional cases.
  • Community Building Projects Split: One respondent is, on the margin, against community building projects that are specifically focused on either neartermism or longtermism instead of broader EA.
Comments

Just gonna weigh in on some of these from my time researching this stuff at Nonlinear.

A common knowledge spreadsheet of directly responsible individuals for important projects.

Strongly agree. It's logistically easy to do; one person could cover 80% of EA projects within a week. I've been using the AI Existential Safety Map (aisafety.world) a lot in my list of follow-ups for 1-on-1s.

In the long run, a well-maintained wiki similar to/synced with the EA Opportunities Board (which I also heavily recommend) could make this really comprehensive.
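For concreteness, here's a minimal sketch of what one row of such a spreadsheet could hold, written as a Python dataclass. All field names here are my own guesses, not an existing schema:

```python
import csv
from dataclasses import dataclass

@dataclass
class ProjectEntry:
    """One row of a hypothetical 'who is responsible for what' directory."""
    project: str        # e.g. "EA Opportunities Board"
    org: str            # owning organization, if any
    dri: str            # directly responsible individual
    contact: str        # public contact route (forum handle, email)
    status: str         # "active", "paused", "handed over", ...
    last_verified: str  # ISO date the row was last confirmed accurate

def load_directory(path: str) -> list[ProjectEntry]:
    # Read the directory from a flat CSV export whose column
    # names match the dataclass fields exactly.
    with open(path, newline="", encoding="utf-8") as f:
        return [ProjectEntry(**row) for row in csv.DictReader(f)]

def stale_entries(entries: list[ProjectEntry], cutoff: str) -> list[ProjectEntry]:
    # Rows not re-verified since `cutoff`; ISO date strings
    # compare correctly as plain strings.
    return [e for e in entries if e.last_verified < cutoff]
```

The `last_verified` column is the part I'd care most about in practice: a directory like this decays quickly unless someone periodically re-confirms the rows, which is where the well-maintained-wiki framing earns its keep.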

More “public good”-type resources on the state of different talent pipelines and important metrics (e.g., interest in EA).  

I read every EA survey I see. They're often quite interesting and useful. I wouldn't say they're neglected, since EAs do seem to love surveys, but they're usually a net positive.

More coherent and transparent communication about the funding situation/bar and priorities.

I am of the opinion that every EA funder should be as transparent and detailed about their funding bar/criteria as possible. Unlike for-profits/VCs, I don't see a strong reason for secrecy other than infohazards. Transparency helps applicants understand what funders are looking for, which benefits both sides. I believe that applicant misconceptions about “what funders want” can hinder EA a lot in the long run due to mismatched incentives: I see a lot of compelling project directions censored or discarded in the early stages simply because applicants think they should be more generic (since being more generic works well in conventional success pathways).

More risk management capacity for EA broadly as a field and not just individual orgs. 

I really liked the post Cash and FX management for EA organizations (effectivealtruism.org) by @JueYan.

Advanced 80K: Career advice targeted at highly committed and talented individuals.

Agree, but I never figured out how to execute this scalably. Usually, if someone has the skillset and motivation to do really well in EA, my priority is to (1) point them to helpful resources they can work through themselves, and (2) link them with someone already doing what they're trying to do.

The problem is that it seems hard to predict in advance who they'd consider a valuable connection. I think none of my most valuable connections in EA so far would've been referred to me by someone else.

Tractable idea: A list of helpful links sent to EAGx and EAG attendees post-conference.

A survey to identify why high-value potential members "bounce off" EA.

I actually bounced off EA for three years (2019-2022). For me, the big reason was that I couldn't find any follow-up steps to pursue (especially coming from Singapore). My experience within EA has been very inspiring and exciting interactions followed by not much follow-up (guidance, next steps, opportunities to pursue, encouragement to start projects, etc.).

[just gonna agree with all the AI Safety points, they've all come up before in my discussions]

 

Evaluation and Accountability

Shoutout to @Mo Putera who is working on this.

Media and Outreach

Casual observation that I can't recall a single EA social media account that I browse simply because it's fascinating and not because I wanna support EA on social media.

And I'm into weird stuff, too. I just binged hour-long videos on Soviet semiconductors and the history of Chang'an.

Incubators: One respondent stated that incubators are "super hard and over-done," mentioning that they are too meta and often started by people without entrepreneurial experience.

Agree; this point has been discussed in detail before: What we learned from a year incubating longtermist entrepreneurship (effectivealtruism.org).

I think it's just hard to do well because there are so many points of failure, it takes a long time for any results to show, and it requires both social skills and technical expertise. That said, I do think a longtermist version of Charity Entrepreneurship seems promising to pilot (actually, I'm gonna bring this up to Kat Woods right now).

Fastgrants and other quick funding mechanisms.

I really like Manifund as a platform!

Thanks for adding these thoughts!

A quick note to say that I’m taking some time off after publishing these posts. I’ll aim to reply to any comments from 13 Nov.
