
This post is a retrospective on the Longtermist Entrepreneurship (LE) Project, which ran for a year and explored ways to incubate new longtermist organizations and projects. If you’re in a hurry, we recommend reading key lessons learned, what we’d be excited about, and what it takes to work in this space.

Thanks to Markus Anderljung, Aaron Gertler, Sam Hilton, Josh Jacobson and Jonas Vollmer for reviewing, as well as many others who reviewed an earlier draft of the document. All opinions and mistakes are our own.

Intro

The Longtermist Entrepreneurship (LE) Project ran from April 2020 through May 2021, with the aim of testing ways to support the creation of new longtermist nonprofits, companies, and projects. During that time, we did market sizing and user interviews, and ran three pilot programs on how to support longtermist entrepreneurship, including a fellowship. The LE Project was run by Jade Leung, Ben Clifford, and Rebecca Kagan, and funded by Open Philanthropy. The project shut down after a year primarily for staffing reasons, but also because of some uncertainty about the project’s direction and value.

We never had a public internet presence, so this may be the first time that many people on the EA Forum are hearing about our work. This post describes the history of the project, our pilot programs, and our lessons learned. It also describes what we’d be excited to see in the future, our concerns about this space, and ways to learn more.

Overall, we think that supporting longtermist entrepreneurship is important and promising work, and we expect people will continue to work in this space in the coming years. However, we aren't publishing this post because we want to encourage lots of people to start longtermist incubators. We think doing longtermist startup incubation is incredibly difficult, and requires specific backgrounds. We wanted to share what we’ve learned transparently and widely to help people learn from our successes and mistakes, and to think carefully about what future efforts should be made in this direction.

If you’re considering starting an LE incubator[1], we’d love to hear about it so we can offer advice and coordination with others interested in working in this space. Please fill out this Google form if you’re interested in founding programs in LE incubation.

Key lessons learned:

  • Overall, it’s likely that one or multiple organizations should be doing LE incubation. We need more longtermist organizations, and the current ecosystem doesn’t seem poised to fix this problem. Our fellowship and matchmaking pilots were promising, suggesting that there’s more we can do to start new organizations.
  • There’s interest in LE programs, but only a limited pool of talent with strong backgrounds in both longtermism and entrepreneurship. Talent is likely to be a significant bottleneck. Hundreds of people expressed interest in doing LE, but a very small number of these (1-3 dozen) had backgrounds in both longtermism and entrepreneurship. There were few people that we thought could pull off very ambitious projects.
  • The idea pool is more limited and less developed than we expected. There are existing lists of ideas, but almost no ideas are fleshed out and have broad support. There are no clear “highest-priority ideas” that are obviously good to pursue and have been carefully vetted. Instead, most people we spoke to thought that the most promising ideas depended on the available talent. We found almost no longtermist ideas for traditional startup-minded people to pursue.
  • Funders are worried about downside risks of some new projects, but often more open to funding short-runway projects with frequent checkpoints. Funders do want to see more ambitious new projects, but in cases where there are potential risks of doing harm, some funders will be hesitant to support new organizations or untested ideas without supervision. Supervision could involve shorter runways with frequent check-ins and more active oversight. In some cases funders may be more open to taking risks with new projects if the founders are unusually well aware of risks, have active trusted advisors, and test out risky actions in small-scale, temporary ways.
  • An incubator’s attention should be focused on starting new orgs (incubation) rather than supporting existing startups (acceleration). There aren’t a large number of existing longtermist startups, and those that are the most promising are mostly already able to access the support and networks they need. In addition, providing high-quality one-on-one support isn’t very scalable.
  • The LE Project should be divided into several different organizations/efforts, rather than one incubator. The LE Project tried to cover a broad mandate that we now believe should be split between several different organizations. We’d be more excited about groups that have a more precise vision for a specific program to run, rather than trying to cover the whole area.
  • Variants of this idea that we think could go badly, and that we would be concerned to see:
    • A traditional Y Combinator for EA / longtermism — based on differences in talent pool, method for finding ideas, and risk appetite. See more on this in “What we’d be concerned about” below.
    • An incubator encouraging ideas that the staff hasn’t vetted
    • An incubator encouraging people to do ambitious projects they aren’t able to execute on well
  • Working on incubation requires a background that few people have (including us). This work requires experience starting organizations and advising startups, strong longtermist knowledge, strong EA commitment and network, as well as other characteristics. We’d be skeptical of any incubator led by a team without experience in the specific area where they’re planning to start projects.


Overview & motivation

The Longtermist Entrepreneurship Project ran for one year, with a team of 1-3 people throughout the year. It was supported by Open Philanthropy with an initial grant of $180,000 to Jade Leung in February 2020 and follow-on funding of $500,000 in September 2020. The EA Funds Long-Term Future Fund also made $200,000 available for pass-through grants to individual entrepreneurs, though the majority of this funding was not used. We were supported by the fiscal sponsorship of the Centre for Effective Altruism.

In total, roughly $300,000 has been spent on the LE Project so far, including staff time and pass-through grants.[2] Roughly 2 FTEs of staff time were spent over the year (24 months of full-time effort).[3]

Our high-level goal in this project was to create a robust ecosystem for longtermist entrepreneurship — which we define as the creation of new organizations and projects that are good for the long-term future of humanity (including projects in specific areas of longtermism like AI risk and bio, projects to increase civilizational resilience, as well as some meta EA and meta longtermism projects). We define a “robust ecosystem for LE” as a world in which high-impact, risk-conscious longtermist organizations are started regularly to address key gaps in longtermism, and in which the longtermist community is able to implement solutions at least as quickly as it generates them. Some indicators of that ecosystem include:

  • Talent: We have enough talent to act on our ideas, and pipelines to generate longtermist entrepreneurs in the future.
  • Ideas: The community generates promising ideas, shares them with people who might implement them, and can evaluate which ideas should be implemented.
  • Funding: Funders bet on promising longtermist (LT) ventures, and founders know how to access funding.

Based on our personal experiences, and conversations with longtermist / EA community members, we believe that the LE ecosystem is not currently robust, and that this problem may not resolve on its own in the near future without the support of LE incubation programs. That’s why we were motivated to work on the LE Project.

The incubator shut down in May 2021, primarily for staffing reasons, but also because of some uncertainty about the value of the program on the part of one of the founders. One founder left in the fall of 2020 to pursue an unexpected job offer that they thought was higher impact. One decided to attend grad school, and the third wasn’t interested in being a solo founder. For the most part, these decisions were motivated by personal life circumstances, but it is possible that some or all of the founders would have stayed if the work had been more promising or tractable.

Pilot programs run

LE Survey

In June 2020, we sent out two surveys to assess the demand for longtermist entrepreneurship. We received >250 responses from people who identified as possible or current longtermist entrepreneurs, many of whom would be interested in participating in our programs. Over the year, we were also connected to ~100 other people who might be interested in doing longtermist entrepreneurship, both through word-of-mouth connections and 80,000 Hours. We currently have a CRM of over 400 people who might be interested in LE.

Key takeaways:

  • The survey surfaced hundreds of highly aligned, moderately entrepreneurial people interested in LE, suggesting there’s a decent talent pool for LE. The majority of respondents had early indications of entrepreneurial inclinations, but no strong track record of entrepreneurship or longtermist experience. Fewer than 25 had a strong track record in both longtermism and entrepreneurship.
  • We only moderately promoted the survey, and suspect we substantially undersampled.
  • No program stood out as obviously ideal for any sub-group of the respondents (which we tagged as distinct ‘user groups’ based on common traits). Users were excited about having an idea database, a Slack for longtermist entrepreneurs, and 1:1 coaching. Users reported being bottlenecked by access to funding[4] or finding time to work on ideas.

Fall Fellowship

In the fall of 2020, we piloted a part-time 10-week fellowship for 10 longtermist entrepreneurs, some of whom formed co-founder pairs.

Fellows entered without being tied to specific ideas, and spent their time exploring and testing possible ideas, working an average of 16 hours/week. They received weekly coaching, all-hands structured meetings, resources on how to find and test ideas, connections to advisors, and optional peer coaching sessions. The program cost roughly $80k ($50k in staff time and $30k in fellow stipends) and took roughly 1,300 hours of staff and advisor time to run.

At the end of the fellowship, seven[5] groups pitched for up to $100k in seed funding. They were evaluated by Claire Zabel, Jonas Vollmer, Kit Harris and Sjir Hoeijmakers. Five of the seven groups applied for funding; two groups didn’t apply. Two groups were rejected; three groups received funding offers of varying amounts:

  • One participant, Josh Jacobson, was funded to research physical-world interventions to improve productivity and safeguard life, including interventions such as improving air quality and preparing for nuclear war or earthquakes. He received $53,000 for 6 months, with the possibility to renew for another $53,000 after 6 months of full-time work. He accepted the funding and is working on this project, and he plans to help longtermists implement worthwhile interventions he identifies.
  • One team received $30k for 3 months of exploring EA field building, including outreach around EA books, with the possibility to renew for another $30k. They are unlikely to accept the funding, and decided to pursue other opportunities that gave them career capital instead.
  • One person was invited to apply for $16k in runway funding for a career transition. They are unlikely to accept the funding, and instead found a job in a related area.

Key takeaways:

  • The fellowship provided some value, but fell short of the intended outcomes. Only one person was funded.
  • The pilot validated the basic hypotheses of running a program like this, including participant interest, ability to progress on ideas, and ability to dedicate time.
  • Fellows were enthusiastic about the program. The average net promoter score was 8.
  • Funders didn’t fully trust fellows to make their ideas work on their own after the fellowship, and had a preference for shorter runways and frequent check-ins. In retrospect, we had blind spots in our enthusiasm for fellows and didn’t fully challenge them on the limits of their expertise.
  • Future iterations of the program should likely involve more working hours, have an increased focus on the social dynamic, and prioritize co-founder matching. Possible changes could include: themed fellowships, fellowships with pre-vetted ideas, or fellowships for more senior or promising fellows.
  • There’s reason to think that future programming in this area could be promising, but would require some iteration.

Matchmaking Pilot

In spring 2021, we ran a matchmaking pilot, aiming to match promising ideas with founding entrepreneurs. We picked 15 promising ideas from a list of previously generated startup ideas[6], and identified “idea patrons”: respected members of the EA community who would be useful advisors for the idea. We spoke with the idea patron about what was needed to get the idea started, headhunted promising entrepreneurs, and approached the entrepreneurs about their interest in starting the idea. Of the 15 ideas, only one reached the stage where the patron evaluated potential entrepreneurs. Others were suspended because patrons lost enthusiasm or the idea needed either a) more development or b) more follow-through / support than we felt we’d be able to provide (given that it wasn’t clear how long the incubator would be around for).

Key takeaways:

  • Patron enthusiasm for a matchmaking program was high. 80% of the potential patrons we contacted were interested in working with us, although 44% decided the timing was wrong to commit now and 22% became less enthusiastic about their ideas as we discussed them in more detail.
  • Our headhunting results were more promising than we expected, although we didn’t gather a huge amount of data. Overall, we were able to identify at least some promising candidates for each idea, even those requiring specialized experience.
  • Entrepreneurs were receptive to taking on someone else’s idea. 90% of the 50 people contacted responded, 22% were happy to advise on the project, and 24% of the people applied to be the founding entrepreneur.
  • Many of the ideas seemed promising at first glance, but needed teams to spend time fleshing them out before it was clear how valuable they would be. In some cases, we concluded that an idea shouldn’t be pursued for several years, or needed a research project as a next step (rather than entrepreneurial action). There were few ideas that were obviously good, ready to be implemented, and had patron support.
  • Most patrons were most enthusiastic about a matchmaking scheme that also included ongoing support for the entrepreneur, which we weren’t able to provide.

There’s reason to think that future programming in this area would be promising. Ideas were available, patrons were enthusiastic, and entrepreneurs were receptive; however, it would require more capacity to flesh out ideas and support entrepreneurs.

User Interviews

In early 2021, we conducted 40 in-depth user interviews with potential longtermist entrepreneurs, drawn primarily from the survey respondents + referrals. Our aim in user interviews was to see if there were common bottlenecks and needs among specific types of users. To do this analysis, we grouped talent by: track record of entrepreneurship, content expertise in an area of longtermism, and level of embeddedness within EA.

Key takeaways:

  • Consistent with the survey findings, there were many interested people but only a small fraction (1-3 dozen) had a track record in both entrepreneurship and longtermism.
  • The users that we were most excited about (typically because they had a few years of longtermist + entrepreneurial experience) were rare, and there weren’t enough of them to build programs around.
  • No one user group stood out as being ideal to work with. However, we thought it might be promising to work with one of the following groups:
    • Highly entrepreneurial people, usually with tech backgrounds, who were new to EA, interested in longtermism, and looking for their next startup idea
    • EA operations generalists, often with community-builder backgrounds, who would be interested in working on EA meta projects
  • People were most excited about programs that allowed them to meet peers, get ideas, and have an accountability structure.
  • Our analysis was that people were most frequently bottlenecked by not having a clear structure in which to pursue LE, and by a lack of good advisors to help them think through ideas.

Bespoke Advising

In the summer of 2020, Jade provided support to three existing longtermist start-ups. Jade ran strategy sessions on theory of change, impact metrics, and community strategy. She also offered support with hiring, introductions to mentors, and coaching, which some of the organizations took up. Participants rated the program as very useful, although Jade decided not to continue the pilot because of concerns about scalability and impact.

Key takeaways:

  • While teams appreciated the opportunity to structure their thinking, and rated the sessions as valuable, Jade thinks the program primarily accelerated discussions that would have happened anyway, and had reasonably low counterfactual impact.
  • Hiring support, introductions to mentors, and coaching weren’t as popular as the strategy sessions, suggesting that the teams were mostly doing fine in these areas.
  • There weren’t many organizations that Jade was aware of that she thought were both very promising and unable to find reasonable amounts of support elsewhere. She thought that even the three organizations supported via this pilot likely could have found reasonable support elsewhere, and most have since done so. For this reason, and because of concerns about scalability, the pilot was stopped. We’re not particularly excited about future programming in this area.

Lessons learned

  • It’s likely that something should exist in this space.
    • The current ecosystem for longtermist entrepreneurship is obviously lacking.
    • Pilot programs were weakly positive, suggesting programs could be useful.
    • There’s some interested talent, but not a ton.
    • We three have different views about how promising the incubation space is.
  • Talent pool is larger than expected, but less senior. Talent is likely to be a significant bottleneck. People are fairly open to being nudged towards working on specific ideas.
    • There are very few people with longtermist & entrepreneurial experience (e.g., 2-3 years experience in both) that we trust to execute ambitious projects in specific areas of longtermism (bio, AI, etc.).
    • There are hundreds of junior people interested in doing something in LE.
    • People were relatively open to being nudged towards LE. Within LE, they were open to being nudged towards specific ideas (for better or worse).
    • There was no particular reason to think that the talent pipeline issues would significantly resolve themselves in the coming years.
  • The idea pool is more limited and less developed than we expected. There are existing lists of ideas, but almost no ideas are fleshed out and have broad support.
    • After speaking to experts in areas of longtermism, it became clear that there were no 'highest-priority ideas' that are obviously good to pursue — instead, most felt that the most promising ideas depended on available talent.
    • Very little thinking has been done to identify and thoroughly flesh out & vet orgs or projects that need to be started in LT.
    • We found almost no longtermist ideas for traditional startup-minded people to pursue (product-based with quick path to scale, feedback loops), and don’t expect this will be the majority of promising longtermist startups.
  • Attention should be focused on starting new orgs and projects rather than supporting existing ones
    • There aren’t a lot of longtermist startups, and many of the most promising ones are already able to access networks and support they need. It’s also difficult to scale one-on-one advising.
  • We can do more to start new orgs and projects. The fellowship and matchmaking pilots were promising.
    • The fall fellowship was promising. People were interested in participating and able to progress on ideas. There’s room to iterate, e.g. making it themed, longer, or with specific talent backgrounds.
    • Matchmaking work also seems promising. 80% of patrons and 20% of candidates contacted in our pilot were interested in being involved.
  • Funders are worried about downside risks of new projects, but often more open to funding short-runway projects with frequent checkpoints.
    • Relative to the traditional startup space, there’s a much bigger focus on downside risk. Funders do want to see more ambitious new projects, but in cases where there are potential risks of doing harm, some funders will be hesitant to support new organizations or untested ideas without supervision. This could involve shorter runways with frequent check-ins and more active oversight.
    • In some cases funders may be more open to taking risks with new projects if the founders are unusually well-aware of risks, have active trusted advisors, and test out risky actions in small-scale, temporary ways.
    • Funders are generally open to funding relatively low-ambition projects, if they’re high impact and needed. There are lots of valuable things to do in the meta space that may not scale to a full organization or startup. It’s much harder, although likely much more valuable, to get agreement and talent on more ambitious, risky projects.
  • LE Project should be divided into several different organizations
    • The LE Project tried to cover a broad mandate that likely should be split into several different groups, run by orgs with different skill sets and priorities.
    • We’d be pretty skeptical about a group trying to do broad LE incubation again, and more excited about groups that have a more precise vision for a specific program that they’re well-suited to run in this space.

What we’d be excited about

We can imagine future programs that would be valuable, provided they’re started by someone qualified (see more on this below). In particular, we’d be most excited to see:

  • Longtermist-leaning EA Meta incubator, specifically for medium-sized meta projects centered on improving and growing the EA + longtermist community (likely talent programs, rather than object-level projects in AI, bio, etc.). We could see a range of programs that would be useful in this space, including fellowships, matchmaking, active grantmaking, etc. We think there’s likely a significant amount of low-hanging fruit here, and a decent talent pool interested in working on medium-sized EA meta projects. See Open Phil’s RFP for a list of some suggestions in this space. For any project in this space, we think it would be important to draw a clear distinction between EA Meta work writ large and longtermist-leaning EA Meta work.
  • Foundry / idea-first programming doing idea generation and launching of specific organizations. We’d be particularly excited to see specialized foundries focusing on one cause area, run by a team with background in that area (e.g. AI safety, biorisk, longtermist policy) — we expect these areas to be very difficult to operate in, and potentially difficult to get funded, but likely very valuable. It’s possible that fellowships with pre-vetted ideas would work well in combination with a foundry.
  • Matchmaking / active grantmaking programs that build on the success of the pilot. This would be an obvious fit for a funder, but could also be done by a new organization or in combination with a foundry or an EA meta incubator.
  • A community for entrepreneurs, making it easier to find other entrepreneurs to exchange ideas with, helping entrepreneurs not bounce out of EA, providing a way to network with advisors and co-founders, etc.

Incubation aside, we also see the need for:

  • More research on newer areas of longtermism (e.g. improving institutional decision-making, civilizational resilience) such that we can get to the point of having more concrete, fleshed out views of what ideas we might want to prioritize in these spaces.
  • More talent efforts, particularly those focused on finding or creating entrepreneurs with significant content expertise in areas of longtermism (e.g., PhDs in bio or AI interested in starting companies), since this is one of the bottlenecks for most ambitious entrepreneurship.

What we’d be concerned about

What plausible ideas probably shouldn’t exist in this space?

  • Traditional Y Combinator (YC) for EA / longtermism. We often hear the suggestion that EA / longtermism needs a YC or traditional startup programs. This seems wrong to us due to significant differences between the tech sector and LE. We’re pessimistic about programs that seek to emulate startup incubators and more optimistic about programs that are highly specialized. Relative to the traditional startup space:
    • The talent pool size is much smaller (only 100s of potential longtermist entrepreneurs in total vs tens of thousands of entrepreneurs).
    • There are very few existing longtermist startups (<20), so evaluating existing teams and ideas isn’t likely to be valuable. There are also very few people launching longtermist startups on their own.
    • The method for finding startup ideas is very different (much more about research, strategic thinking, forecasting long-run consequences, and introspection than finding product-market fit).
    • The risk appetite, particularly of early-stage funders, is very different: There is very little downside risk in startups (lose your investment) vs huge downside risk for LE (harm that could be equal to or greater than the upside).
    • These all suggest that a much more cautious, specialized approach makes sense.
  • A generic longtermist incubator. For reasons described above, we’re skeptical about a new LE incubator that aims to cover this entire area, rather than focusing on a smaller slice of this space.
  • An incubator starting in AI, bio, etc that is led by a team without experience in that particular area. See more on team fit below.

For any future longtermist entrepreneurship project, we’re particularly worried about the following two risks:

  • Encouraging people to do ambitious projects that they aren’t able to execute well
    • In addition to risks of doing harm, there are risks of “poisoning the well” and discouraging people from future efforts in that area.
    • We think most funders are pretty cautious about this, so this is particularly a concern for people who are self-funding or working with less established funders.
  • Incubator staff encouraging bad ideas, or encouraging entrepreneurs to work on ideas they don’t understand
    • Given how unclear the space is, and the downside risks, it’s important that anyone running an incubator thinks through the ideas that they’re supporting themselves, rather than outsourcing that judgment to cause-area experts. Almost no ideas are good “in a vacuum” — the impact depends on the entrepreneurs who would implement the idea, so getting expert advice is helpful but not sufficient. Incubator staff are the only ones well-positioned to understand the interaction between ideas and the people who apply to work on them.

What it takes to work in this space

We think this space is particularly difficult to work in, and we expect there are very few people in the longtermist community with the right background and orientation to run incubators. In retrospect, we’re unsure we had the right background as a team to succeed, and think future team experience is important for success.

The right team to run another longtermist incubator would likely need to have:

  • Experience with starting and advising startups and organizations
    • Experience founding orgs, and advising early-stage startups and organizations.
    • Entrepreneurial grit: fast-paced, able to operate under uncertainty, experience iterating and piloting.
  • Strong longtermist knowledge and intuitions
    • Familiar with longtermism content areas (AI, bio, etc.). Opinions on what organizations should be started, understanding of the current state of the work, familiar with the risks and ongoing projects of each.
  • Strong EA commitment, network, and reputation
    • Has a strong EA network and reputation, including with EA funders.
    • Familiar with EA thinking and history, aware of what projects have been tried in the past. Familiar with EA social norms and organizations.
    • Strong value alignment, deeply motivated by pursuing a better long-term future, and a stronger longtermist ecosystem in particular.
  • Demonstration of other key traits
    • Commitment & conviction to the space.
    • Rigorous thinking, interest in navigating complex intellectual spaces.
    • Friendly, sociable, good on teams, able to network and work well with others.

Don’t get confused by the hype: In addition, we think it’s important that the founding team isn’t confused by the hype around LE incubation. The incubation space can seem particularly exciting relative to other possible LE orgs to start. We think this is a bit risky, and might lead people to work in the space who don’t have appropriate backgrounds, or are motivated by a false sense of excitement.

  • The work can be unexpectedly difficult or slow-going. It’s difficult to find a clear user group, and there are tradeoffs between running high-value programs now, and programs that would be sustainable to run for many years.
  • Many of the programs that need to be started won’t be exciting, high-ambition, or able to scale rapidly. The hype of an incubator likely won’t map onto the reality of the programs being started.

What next

We’re grateful for the support of Claire Zabel and the Open Philanthropy Project, the fiscal sponsorship of the Centre for Effective Altruism, and the numerous advisors, users, and program participants who were willing to give their time to us despite the roughness of our programs. We hope that the lessons learned from the time you gave us will continue to be valuable even though our program has ended.

If you’re considering starting an LE incubator[7], we’d love to hear about it so we can offer advice and coordination with others interested in working in this space. Please fill out this Google form if you’re interested in founding programs in LE incubation. We’ll follow up with most people who express serious interest in starting an incubation program to offer connections and advice, and to learn more about your plans.

There are a few groups currently exploring doing work in this area. We haven’t closely evaluated any of their work, but if you’re interested in following similar work it might make sense to reach out to the following projects:

  • Charity Entrepreneurship: CE has a history of successfully supporting entrepreneurs to launch highly effective charities. They began supporting EA meta charities this year.
  • EA Funds: Jonas Vollmer is considering doing some sort of EA meta incubator as part of EA Funds, though he's not sure yet how likely this is to happen.
  • WANBAM Mentorship: WANBAM is piloting new mentorship rounds starting at the end of 2021. It’s quite likely that they would be looking to facilitate peer support and mentorship for subsets of EA entrepreneurs (regardless of gender); please reach out to eamentorshipprogram@gmail.com if you’re interested in this.
  • Direction-setting and idea generation workshops: Owen Cotton-Barratt, Rebecca Kagan, and Damon Binder are organizing workshops aimed at brainstorming what we’d like the world to be like on a 10-20 year timescale, and what this means for projects that we’d like to see launched today. They expect to put out posts on the EA Forum about this work in the coming weeks.

A longer retrospective on the incubator with private details is also available upon request, as well as many detailed retrospectives on our individual programs. Please reach out to imbenclifford@gmail.com if you think seeing these documents would be useful for your work.


  1. This survey is for people interested in incubation specifically, separate from “longtermist entrepreneurship” — namely, starting organizations that will help support multiple longtermist startups. Throughout this post, we use the phrase “LE Incubation” or “incubation” to refer to meta projects supporting the creation of more longtermist startups. ↩︎

  2. An additional $129,000 was earmarked for funding fellows, although we don’t expect all of it to be distributed. ↩︎

  3. This estimate doesn’t count participant time, including fellows, both during the programs and if funded afterwards. ↩︎

  4. This finding was mostly not supported by our user interviews, the majority of whom said they weren’t funding constrained. However, some users said they didn’t know how to access funding if they weren’t already well networked. This gave us a preliminary sense that, rather than being funding constrained, some users feel limited in their ability to access funding. Further exploration could be valuable. ↩︎

  5. Of the 10 participants, one dropped out (leaving nine fellows). Of the nine fellows, two sets formed co-founder pairs, leaving seven groups of founders (five individuals, two pairs). ↩︎

  6. Ideas were chosen from a previously compiled meta list of ideas, which pulled from several published and private lists of possible ideas. Most ideas were not carefully vetted, and had between one sentence and a few paragraphs of explanation. The 15 ideas were selected based on being potentially promising to implement now, having an “idea patron” and not requiring a huge amount of specific background knowledge either on the part of the incubation team or the founding entrepreneur. ↩︎

  7. Incubation specifically, separate from “longtermist entrepreneurship” — namely, starting organizations that will help support multiple longtermist startups. ↩︎

Comments

Note: This message came out of a conversation with u/AppliedDivinityStudies and therefore contains a mix of opinions from the two of us, even though I use "I" throughout. All mistakes can be attributed to me (An1lam) though.

Really appreciate you all running this program and writing this up! That said, I disagree with a number of the conclusions in the write-up and worry that if neither I nor anyone else speak up with our criticisms, people will get the (in my opinion) wrong idea about bottlenecks to more longtermist entrepreneurship.


At a high level, many of my criticisms stem from my sense that the program didn't lean into the "entrepreneurship" component that hard, and as a result ended up looking a lot like typical EA activities (nothing wrong with typical EA activities).


First, I strongly disagree with the implicit conclusion that fostering an LE ecosystem requires lots of existing LE entrepreneurs, specifically:


Hundreds of people expressed interest in doing LE, but a very small number of these (1-3 dozen) had backgrounds in both longtermism and entrepreneurship. There were few people that we thought could pull off very ambitious projects.

And also:

Talent pool is larger than expected, but less senior.


If there existed a large pool of LE entrepreneurs with the right skills, there'd be a less pressing need for this sort of program. I get that you're wary of analogies to tech startups due to downside risk, but to the degree one wants to foster an ecosystem, taking a risk on at least some more junior people seems pretty necessary. Even within the EA ecosystem, my sense is that people who founded successful orgs often hadn't done this before. E.g., as far as I know Nick Bostrom hadn't founded an FHI 0.0 before founding the current instantiation of FHI. Same for GiveWell, CEA, etc. Given that, the notion that doing LE entrepreneurship requires "backgrounds in both longtermism and entrepreneurship" seems like too restrictive a filter.


Second, without examples it's a little hard to discuss, but I feel like the concern about downside risk is real but overblown. It's definitely an important difference between LE entrepreneurship and traditional startups to be mindful of, but I question whether it's being used to justify an extreme form of the precautionary principle that says funders shouldn't fund ideas with downside risks instead of the, more reasonable IMO, principle of funding +EV things or trying to ensure the portfolio of projects has +EV.


Third, I think some of the assumptions about what types of activities should take precedence for LE entrepreneurship deserve re-examining. As I alluded to above, it seems like the activities you say matter most for LE entrepreneurship, "research, strategic thinking, forecasting long-run consequences, and introspection [rather] than finding product-market fit" are suspiciously similar to "typical EA activities". From my perspective, it could instead be interesting to try and take some of the startup gospel around iteration, getting things out into the wild sooner rather than later, etc. seriously and adapt them to LE entrepreneurship rather than starting from the (appearance of the) assumption that you have very little to learn from that world. This isn't fully charitable, but I have the sense that EA has a lot of people who gravitate towards talking/strategizing/coordinating and getting other people to do things but sometimes shy away from "actually doing things" themselves. I view an LE entrepreneurship incubator as an opportunity to reward or push more people towards the "actually doing things" part. Part of this may also be that I'm a bit confused about where the boundary between normal and LE entrepreneurship lies. In my mind, SpaceX, fusion startups, psychedelics research would all qualify as examples of LE entrepreneurship with limited downside risk or at least not existential downside risks. Would you agree that these qualify as good examples?


Fourth, you mention advisors but only name a few. I'm 1) curious whether any of these advisors were experienced entrepreneurs and 2) interested in whether you considered getting advisors only adjacent to EA but very experienced entrepreneurs. As an example, at least one founder of Wave is an EA-aligned successful entrepreneur who I can only imagine has wisdom to impart about entrepreneurship. I don't live in the Bay Area but I have the sense that there are quite a few other EA-adjacent founders there who might also be interested in advising a program like this.


Fifth, this is more low-level but I still don't really understand the skepticism of a YC-like incubator for LE entrepreneurship. It seems like your arguments boil down to 1) the current pool is small and 2) the requirements are different. But on 1, when YC started, the pool of entrepreneurs was smaller too! Such a program can help to increase the size of that pool. On 2, I agree that a literal copy of YC would have the issues you describe but I'd imagine a YC-like program blending the two communities' thinking styles in a way that gets most of the benefits of each while avoiding the downsides. As an aside, we are also very supportive of longtermists doing YC but for slightly different reasons. This may also be related to the confusion about what qualifies as LEE.


Summarizing, my goal in writing this comment is not to just criticize the program. Instead, I worry that by highlighting the need for experience and the overwhelming risk of harm, the write-up as-is might discourage would-be LE entrepreneurs from trying something. I hope that my comment can help provide a counterweight to that.

FWIW, as someone who previously warned about risk of accidental harm, I personally mostly agree with this comment. I think what I care about more is "option value to shut projects down if they turn out to be harmful" than preventing damage in the first place (with the exception of projects that have very large negative effects from the very beginning).

I think offering funding & advice causes more people to work with you, and the closer they are working with you, the larger the influence your opinion is likely to have on the question of whether they should shut down their project.

Thanks for the comment An1lam, and apologies for the delay! Really appreciate the engagement. 

Some thoughts in response: 

On (1), I agree that fostering an ecosystem doesn’t require a load of experienced people; you’re very much right that that’s the whole problem that we’re trying to solve in the first place! What we mostly mean to say in terms of pointing to the lack of requisite experience is that we found that there weren’t that many individuals who are ready to hit the ground running with founding a substantial longtermist start-up right now; hence a lot of LE activities that we would have been excited about pursuing instead (and which are mentioned in the “what we’d be excited about” section) are targeted at making up for this lack of experience by e.g. fostering a community, bringing folks in-house in a foundry. 

On (2), as you say, it’s difficult to talk about this in the abstract, and certainly I feel sympathetic to the worry that we might be being too cautious (that was one of the motivations for me starting this project in the first place). With that said, I do think thinking thoroughly through downside risk considerations and treating them more seriously than is the norm among founders writ large is something that I want to encourage of all longtermist entrepreneurs, and I’d worry about not doing it enough more than I’d worry about doing it too much given what’s at stake. The thing that I’m interested in advocating for here, though, is something like demonstration of thought, care, and consideration, rather than avoiding projects with downside risk altogether (I think the two often are conflated); for example, if it turns out that after a reasonable amount of thinking there is some amount of downside risk, but the EV of the project is still high & the founder has identified reasonable things they could do to mitigate this downside risk, I (and I imagine most EA funders) would be supportive of that project going ahead.

On (3), seems right that EAs in general, and particularly folks trying to start new projects, have a fair amount that they could learn from ‘traditional’ start-up best practice (that was a big part of the framing that we took in designing curricula materials for the fellowship, for example). I’m not sure where we disagree here; I do also think that strategic thinking and introspection are important, and perhaps uniquely so for start-ups which don’t follow the usual product-fit dynamic (although I think several do, even something like a new research org), or where there are very slow feedback loops to use in the interim, hence why it might be unusually important for longtermist entrepreneurs to have this trait. But I don’t think thinking this is important is mutually exclusive to thinking that learning from more mainstream start-up practice is useful. 

On (4), advisors were a combination of experienced entrepreneurs (some of whom also happened to be quite engaged with EA, some less so) and domain experts. 

On (5), basically Jonas’ comment captures the main thing - that given our epistemic state in longtermist cause areas at the moment, it just seems quite difficult to come up with start-up ideas that fit the typical YC model (find product-market fit, then scale), which means that the idea generation process I think needs to be quite a bit more involved, be advised by domain experts, etc. and the build-iterate cycle necessarily needs to be tweaked from the usual dynamic expected for e.g. software products. I.e. mostly leaning on (2) of ‘the requirements are different’ rather than the pool being small. 

Two additional high level things that come to mind in response: 

1 - We definitely don't want to discourage promising longtermist entrepreneurs who are excited about starting something in this space, the intent was to clarify for future potential incubator founders what we tried and learned. 

2 - In terms of your overall concern that the program wasn't entrepreneurial enough, it's a bit hard to comment. At the end of the day, we do think that simply replicating traditional entrepreneurship in this space won't quite work, but also do think that the EA community could learn a ton from the entrepreneurship community (and designed our program with both of those in mind). It's possible you disagree, or possible that the tone and seriousness with which we centered entrepreneurship didn't come through in this post. 

Hey Jade,

Thanks so much for your reply. This actually really helped clarify things for me. I think we may still have some different priors about (2) but overall your comment made me think we agree much more than we disagree (and than I'd previously thought we'd disagreed). 

I again just want to note that I'm grateful you ran the program and engaged so productively with my comment.

Really glad to hear, and in turn, really appreciate your engagement with the post! 

Regarding a YC incubator model, I think the main issue is just that people rarely generate sufficiently well-targeted and ambitious startup ideas. I really don't think we need another dozen donation apps or fundraising orgs, but that's what people often come up with. I think we'd want something that does more to help people develop better ideas. (Perhaps that's what you had in mind as well.)

Honestly I wasn't too sure what the biggest issue was but what you described seems reasonable to me! 

A quick thought on having a YC-style programme and taking risks on more junior talent:

Domain expertise is important - I think YC would agree on this. If taking on a deep tech startup they would look for someone on the team who had domain expertise in the field.

I think early YC Internet startups like Dropbox or Airbnb make it look like domain expertise is less important and it’s more about just getting stuck in. The difference is that when Dropbox started there was no expert in “files on the internet” so the founders could basically become the world experts just by getting stuck in and working on it.

The difference with Longtermist areas like AI and Bio is you can’t just become the expert by working on it and (yes, you guessed it) the downside risk means we don’t want to take bets on people to just go for it and try it out (unlike Dropbox where it doesn’t really matter if it fails catastrophically).

Very interesting, valuable, and thorough overview!

I notice you mentioned providing grants of 30k and 16k that were or are likely to be turned down. Do you think this might have been due to the amounts of funding? Might levels of funding an order of magnitude higher have caused a change in preferences? 

Given the amount of funding in longtermist EA, if a project is valuable, I wonder if amounts closer to that level might be warranted. Obviously the project only had 300k in funding, so that level of funding might not have been practical here.  However, from the perspective of EA longtermist funding as a whole, routinely giving away this level of funding for projects would be practical.
 

Hey Michael! I don’t know if more money would have changed their decisions, but I want to clarify that the funding panel wasn’t funding constrained (we actually had more than $300k set aside for this), and funders didn’t make the decision with that as a limitation.

The cases aren’t actually that similar — in one, the funding panel gave a low amount to discourage the individual from pursuing the idea and support a career transition, in the other they gave the individuals more than requested — but in both cases the uncertainty of what the people would do was the key cause in giving a relatively small amount of money, not being funding constrained.

If you don't want someone to do something, it makes sense not to offer a large amount of $. For the second case, I'm a bit confused by this statement:

"the uncertainty of what the people would do was the key cause in giving a relatively small amount of money"

What do you mean here? That you were uncertain in which path was best?
 

"the uncertainty of what the people would do" -->

Both groups were being funded for open-ended plans (in one case, a career transition, in the other "exploring EA field-building"), rather than a specific venture, hence the uncertainty.

"If you don't want someone to do something" -->

This isn't the case -- if the funders hadn't wanted the recipients to move forward, they wouldn't have given funding. In that case, the funder offered to support a different plan than the one that was originally pitched, namely instead of a venture, a career transition.

I just discovered this thread, but figured I'd back up some data points! I've been in the EA community since 2016-ish and done entrepreneurship and longtermist work for about 2 years. I won't lean too much into my experience, I'll generally say things other entrepreneurs and impact-minded people would agree with.

Re: Founder pool

The talent pool size is much smaller (only 100s of potential longtermist entrepreneurs in total vs tens of thousands of entrepreneurs).

It is helpful to think of "longtermism" and "entrepreneurship" as two entirely distinct, and very uncommon, skillsets/mental frameworks. Currently, way less than 1% of the world would identify as longtermist, and less than 1% would identify as entrepreneurs. While these traits are slightly correlated, the vast majority of longtermists are not entrepreneurs, and the vast majority of entrepreneurs are not longtermist.

That's totally fine, but makes things difficult for a longtermist incubator. I mean, if Y Combinator started prioritising longtermist involvement, I'm pretty sure they'd have similar challenges.

Re: Project pool

We found almost no longtermist ideas for traditional startup-minded people to pursue (product-based with quick path to scale, feedback loops), and don’t expect this will be the majority of promising longtermist startups.

Again, overlap is hard. One thing Y Combinator constantly emphasises is that most startups never achieve product-market fit. I.e. it's really, really hard to build something people want to pay for. So the reliable solution is to try and iterate a lot with fairly little information. A 10% success rate is really good if you try a dozen times a month.

So just building anything successful is hard. Then you have to try and make something impactful, which is another, completely separate hard thing to do. Most successful, lucrative companies are not "effective" or "impactful" in the longtermist sense. I've tried this thought exercise myself, sometimes I would think of/test like 9 potential lucrative startup ideas before I find one that's lucrative and also possibly impactful.

Anyway, point is: This is a difficult thing to do, essentially trying to find a very narrow eligible pool and get them to work on a very small set of possible ideas. I really appreciate that this was tried!

Given the constraints highlighted above, it seems like a venture builder model (focussed on a specific cause area) may be more effective, wherein the following process is repeated:

(1) Generate plausible venture ideas from existing research within EA orgs

(2) Analyze ideas on two dimensions: (a) cost-benefit analysis, (b) operational feasibility

(3) Incubate and recruit EA-aligned technical and non-technical co-founders (who then build their own team)

(4) Tie further funding and possibly bonuses to specific short-term milestones

  • It seems like EA orgs with existing research capabilities are best suited to support steps #1 and #2a above. This is essentially a high-level analysis of the expected value of building this company and a rough estimate of costs (in orders of magnitude), which helps us quickly come to a "Go"/"No go" decision.
  • I think step 2(b) needs to be outsourced to technical consultants or economic consultants (think Analysis Group, Brattle Group, etc.) who can conduct feasibility analyses or more accurately estimate lifetime costs. Let's say we wanted to produce and distribute some state-of-the-art PPE equipment. We'd need to have a rough understanding of things like supply and demand dynamics of raw materials, regulations around PPE equipment, legal risk, etc. Answers to these questions could determine whether it's worth doing something even if the high-level cost-benefit analysis was positive. These also don't seem like the type of questions orgs like Open Phil routinely answer.
  • Step 4 could potentially ease some talent constraints and allow founders to recruit a management team or lead scientists that may not be completely EA-aligned or longtermist but are still incentivised economically to help move things along.

Everything above is of course contingent on the ability to actually generate actionable ideas that pass Step 2(a) in a particular cause area. 

Very interesting read, thanks for publishing this!

I am curious what qualified as "having longtermist experience" for you?

Glad to hear!

Roughly, this would mean having worked in a relevant area (e.g. bio, AI safety) for at least 1-2 years and being able to contribute in some capacity to that field. To be clear, some ideas would require a lot more experience - this is just a rough proxy.

Is there a list of the ideas that the fellows were working on? I'd be curious. 

It's not surprising to me that there aren't many "product focused" traditional startup style ideas in the longtermist space, but what does that leave? Are most of the potential organisations research focused? Or are there some other classes of organisation that could be founded? (Maybe this is a lack of imagination on my part!)

Hi Rory, thanks for the comment! We haven’t published those ideas. In terms of classes of organisation, one way to carve up the space is to think about Object-level and Meta-level approaches to generating ideas.

Object-level approaches focus on doing direct work to solve the problem at hand. For example:

  • developing and deploying technologies
  • conducting research
  • advocating for policy change

The main type of impact here comes in the form of tangible changes in actions taken in the real world, in whatever form that might take.

Meta-level approaches focus on improving the capacity for others to solve the problem. This can be done on the EA/longtermist wide-level (building up the movements) or in a specific domain, e.g. building a talent pipeline specifically for bio policy experts. Concrete types of meta work include, for example:

  • community and field building
  • the dissemination of ideas and knowledge and values
  • increasing the resources available to work on object-level approaches

The main type of impact here comes in the form of the change in likelihood that object-level approaches will be impactful.

Hope that's useful!

I'd be interested to hear more about the thoughts behind this key lesson:

The LE Project should be divided into several different organizations/efforts, rather than one incubator.

This makes sense to me in light of the different tasks and operational requirements that different purposes are likely to require, but I noticed a theme of uncertainty running through the rest of the report. This included things like funders being uncertain of downside risk; uncertainty about what actions a person with funding should take; an expertise/experience bottleneck for longtermism and especially the combination of longtermism and entrepreneurship.

What do you think of the relationship between the constraints and having multiple orgs in the space?

Hi Ryan, I may be misunderstanding the question so correct me if I'm wrong - are you saying something like: "given that there's lots of uncertainty about what's needed this seems in tension with starting an organisation that concentrates on only one user type (e.g. recent generalist graduate) or one domain (e.g. AI Safety)"?

Are the Slack or other community resources still being used / are they still available for additional people to join?

The Slack used for the fellowship is no longer being used.

There are very few people with longtermist & entrepreneurial experience (e.g., 2-3 years experience in both) that we trust to execute ambitious projects in specific areas of longtermism (bio, AI, etc.).

 

Do you have any reflections or recommendations about what people who meet one but not both of these criteria could be doing to become great potential LEs? I appreciate that there is an obvious answer along the lines of "try the other one out!" but I'm wondering if you have any specific suggestions beyond that.

I.e. 

What could people with longtermist experience but negligible entrepreneurship experience be doing to bridge that gap? Are there any specific resources (books, articles, courses, internships, etc) you'd recommend for people to start testing their personal fit with this and building relevant skills?

And the same question again for people with entrepreneurship experience but negligible longtermist experience.

(Also, further to hrosspet's question, I'd be interested in roughly how you were defining/conceptualising those two categories, and whether you have general comments about the ways in which people tended to be insufficiently developed in one or the other.)

Thanks for this post. It's great to see the writeup to be able to learn from the experience, even though it didn't work out for you guys in this iteration of the idea.

I sense a slight potential tension between the comment that "EA operations generalists, often with community-builder backgrounds, who would be interested in working on EA meta projects" seem like a promising group to work with and the comment that "There are very few people with longtermist & entrepreneurial experience (e.g., 2-3 years experience in both) that we trust to execute ambitious projects in specific areas of longtermism (bio, AI, etc.)." I would imagine that the former group would tend to not have much experience in "specific areas of longtermism". I'd love any clarity you can shed on this:

  • Am I just wrong? I.e. do some/many of these people have substantial  experience in specific areas?
  • Is it that you see this group as being promising specifically for various meta projects that don't require deep expertise in any one area?
  • Is it that you think that this gap could potentially be bridged as part of a longtermist entrepreneurship incubator's role, e.g. by getting promising-seeming potential future LEs placed into jobs where they can build some domain specific knowledge before revisiting the idea of LE, or some such?
  • Something else?

Hey Jamie - Ben Clifford here, thanks for flagging this.

I think your second bullet captures the idea well. I don’t think being good at EA community building and associated ideas requires deep domain expertise in areas like AI or Bio.

There would be an argument for thinking about bullet 3 as well but it wasn’t what I was thinking.

Thanks for this post! Reading through these lessons has been really informative. I have a few more questions that I'd love to hear your thinking on:

1) Why did you choose to run the fellowship as a part-time rather than full-time program?

2) Are there any particular reasons why fellowship participants tended to pursue non-venture projects?

3) Throughout your efforts, were you optimizing for project success or project volume, or were you instead focused on gathering data on the incubator space?

4) Do you consider the longtermist incubation space to be distinct from the x-risk reduction incubation space?

5) Was there a reason you didn't have a public online presence, or was it just not a priority?

Thanks, great questions! In response: 

1) Why did you choose to run the fellowship as a part-time rather than full-time program?

We wanted to test some version of this quickly, part time meant:

  • It was easier to get a cohort of people to commit at short notice as they could participate alongside other commitments
  • We could deliver a reasonable quality stripped back programme in a short space of time and had more capacity to test other ideas at the same time

With that said, if we were to run it again, we almost certainly would have explored running a full-time program for the next iteration. 

2) Are there any particular reasons why fellowship participants tended to pursue non-venture projects?

Do you mean non-profits rather than for-profits? If so, I think this is because non-profits present the most obvious neglected opportunities for doing good. Participants did consider some for-profit ideas.

3) Throughout your efforts, were you optimizing for project success or project volume, or were you instead focused on gathering data on the incubator space?

The latter - we were trying to learn rather than optimise for early success.

4) Do you consider the longtermist incubation space to be distinct from the x-risk reduction incubation space?

Yes, mostly insofar as the longtermist space is broader than the x-risk space - there are ideas that might help the long-term future or reduce s-risk without reducing x-risk.

5) Was there a reason you didn't have a public online presence?

I think having an online presence that is careful about how this work is described (e.g. not overhyping entrepreneurship or encouraging any particular version of it) is important and therefore quite a bit of work. We felt we could be productive without one for the time we were working on the project so decided to deprioritise it. If we had continued to work on the project, we would have spent time on this.
