
Let's say you run a non-profit, and you and some of your co-workers are there for EA reasons. The EA Forum is going to be hosting a Marginal Funding Week and you're trying to decide whether to post an appeal. How do you decide whether you're ready to raise funds from the EA community?

At a high level, I think you should go ahead if you can explain what you'd do with the money and are willing to share the details that will let people determine if your overall case is strong enough. As a community I think we should generally have higher standards for projects that have been running longer, and for ones trying to raise larger amounts of money.

New projects, both in the for-profit and non-profit world, generally get off the ground with the engagement of a small number of funders who are comfortable with the risk-reward tradeoffs of early-stage work. Sometimes these funders are highly engaged and provide advice and connections, other times they're giving some start-up funds and hoping it works out, but either way they're taking a substantial risk of failure on each bet in the hope of getting some hits.

In the for-profit world, societies worry that most people are not sufficiently sophisticated to make this kind of investment, and generally draw some sort of line between accredited investors (who can be assumed to know the risks they're taking with early-stage ventures) and the rest of us (who might be dazzled into putting our life savings into a scam). To sell your stock to the general public you need to first disclose a lot of information about your business: detailed financial statements, risks, what you'll do with the money, etc.

The non-profit world is pretty different: while you do have to make some limited information public, the disclosure requirements are relatively minimal. There's no obligation to share facts that a reasonable donor would want to consider.

While I wouldn't advocate extending public-company-level regulation to the non-profit world, this is a place where the EA community has historically tried to shift norms in the direction of more transparency, and I think we should continue to do this:

  • One of the biggest flaws of the non-profit approach is how much weaker the feedback loops are than in for-profit work. There are many ways altruistically-motivated work can become disconnected from the ends it's intended to advance. A culture of sharing and critically evaluating this information can help keep organizations focused, and keep the money going to the organizations that can best apply it.

  • If we become a community where people can raise substantial funds simply by saying they're EAs working on an important problem, we'll end up with lots of people who make the right noises but don't actually put funds to good use.

  • Even if most people reading an appeal don't think hard about whether it all checks out, having this information out there allows dedicated independent people to make comparisons and share their results. This amplifies both effects above.

So I like the for-profit approach as a model. Early in the life of your project you have a small number of high-context funders where you can put time into each funding relationship. As you scale, you "go public" and start also raising money from people you're not going to have conversations with. When taking that step I would like to see orgs generally providing details about what people can expect if they give you money.

The things I'd most like to see in public funding requests are:

  • What will you do with the money? Not just what do you do in general, but what will you be able to do if you raise these additional funds. Note that describing the work you'd do on the margin means that the work you're describing is, well, marginal. If instead you made the case for your strongest work, the work that you'd only cut if your funding dropped off dramatically, this would lead donors to overestimate the benefit of funding you and so allocate their funds less well.

  • What's your track record? Funding early-stage projects typically involves a good amount of evaluating the team as people, figuring out how much you trust them to take your money in a new direction. But if you're "going public" this kind of interpersonal evaluation breaks down: you can't have dozens of EAs on the Forum booking slots on your calendar to get a sense of whether you seem to know your stuff and be the kind of person who will execute well on your vision. Instead, point at concrete accomplishments.

  • What is your financial situation? What are your main expenses and sources of income? Anything people wouldn't expect? How much money do you currently have? How long is your runway?

Additionally, it's pretty valuable to also share:

  • Why is this worth doing? While this isn't information donors can get only from you, it's probably something you're especially well positioned (and incentivized) to provide. What problem are you addressing? Why is it important? How will success at your planned efforts improve the situation?

  • Where does your work fit in? What are the other organizations trying to solve this problem? Why are you taking your specific approach? Are some of these other groups a better fit for donors with certain outlooks or values?

  • What are the key risks? What are the most likely reasons you might fail to make progress? Could your work actually be harmful? If there are major uncertainties on your path to impact, what are they and how are you addressing them?

  • What's your longer-term model? Are you hoping to be philanthropically funded indefinitely? Is this funding that will get you to a place where you can instead convince governments, private companies, or consumers to fund you or others to do this work? Is this the kind of project that needs long-term investment to be worth it?

  • How can people tell if you're succeeding? If you're posting again in Fall 2025 saying that you had a great year and asking for money for 2026, how will we be able to tell whether you actually had a great year? This is closely related to "what will you do with the money?", but instead of focusing on impact it's focusing on evaluation: how will you and others be able to tell if your efforts are working? Are there specific milestones you expect to hit? Measurable outcomes we'll be able to observe?

If you're not ok with including this information in your funding request, or at least answering these and similar questions as they come up in the comments, then it's worth considering whether you're in a good position to solicit funds from the community.

Another consideration in making a public request for funding is that by putting your org out there like this you're opening yourself up to more criticism. Asking the EA community for funding is, in some sense, quite audacious: it's a claim that your organization is one of the very best ways to turn money into a better world. That's a high bar and the EA community can be a critical group! I think on balance EA's critical outlook is positive: if I make what I think is a solid and relatively complete case for my work, and other people who've thought hard about how to make the world better don't think it measures up, that certainly hurts, but it's an important check. The history of non-profit work includes many people who've overestimated the value of their work and would have been able to have much more impact if they'd taken a different approach.

On the other hand, it's easier to criticize than to do, and it's important to nurture transparency by recognizing when people are sharing information they could have kept internal. There are real people with feelings behind each organization, people who in many cases have poured a substantial portion of themselves into these vessels for positive change. We need the critical side of our culture to keep us focused on impact, but we need to balance it with empathy, kindness, and a sense that we're all on one big team pulling together.

Comment via: facebook, mastodon


Comments

So I like the for-profit approach as a model. Early in the life of your project you have a small number of high-context funders where you can put time into each funding relationship. As you scale, you "go public" and start also raising money from people you're not going to have conversations with. 

I think that model works well in some circumstances, and I certainly appreciate the logic behind extending it to the non-profit world when that is the case. However, it's not the case that every potential founder or org has access to a "small number of high-context funders" who are in a position to support the early stages of the project without a public appeal. That means some of them are going to need to go public in a less developed state than would perhaps be ideal. The ability to self-fund, get support from one's family, or access a good pre-existing network for fundraising does not strike me as strongly correlated with the merit of the founder, the org's theory, or the org itself. So I do have some concerns that expecting too much of early-stage founders or ideas will give those (at most) weakly merit-based factors too much weight in determining which founders, ideas, and orgs survive the infant-mortality period.

In general, I'd err on the side of encouraging public appeals rather than err on the side of setting too high a bar. I think the average community member is pretty savvy, and the community's demonstrated deliberative skill in evaluating funding issues seems pretty strong. If the evaluative effort asked of the community became too burdensome, I'd prefer something like people deferring somewhat to a ~randomly selected community screening jury (which could hopefully be at least medium-context) if the alternative were to discourage public appeals.

This post, while clearly well-intentioned, embodies a direction that risks stifling innovation and accessibility within the EA community. By emphasizing rigorous transparency and demanding extensive documentation, it inadvertently discourages EAs from seeking funding support from the community. We should be actively encouraging more EAs to seek funding for high-impact ideas they believe in, rather than setting up barriers that may deter them from doing so. Directionally, the EA community should be much more inclined to support projects that currently lack resources, recognizing that some of the best ideas might come from those who don’t yet have the means to demonstrate immediate, measurable impact.

The current standards, which prioritize detailed disclosures and polished presentations, favor those organizations that are already well-resourced and can bear the burden of compliance. This creates a significant barrier to entry, essentially making it easier for established projects to maintain funding while new, innovative ideas struggle to gain traction. We risk becoming a community that is penny-wise and pound-foolish, where funding disproportionately flows to projects with proven track records, leaving little room for the experimentation and risk-taking that drive real progress.

Moreover, this demand for rigorous transparency can inadvertently create a self-perpetuating cycle: only those who can "play the game" by adhering to existing norms and expectations can secure funding. As a result, smaller, newer projects that may have groundbreaking potential are often left out. This isn’t just a matter of encouraging transparency; it’s about ensuring that we don’t shut out the very voices that could bring fresh, impactful ideas to the table.

We should be updating our approach and actively shifting our focus to support a wider range of initiatives, including early-stage projects that might not yet have a track record but show real promise. This means creating pathways for these projects to thrive, acknowledging that early funding often involves calculated risks, and recognizing that fostering innovation requires more than just rigorous vetting—it requires a willingness to explore new ideas and approaches, even when they don’t fit neatly into existing molds.

The comparison to the for-profit sector is particularly concerning. In the for-profit world, stringent standards are designed to protect unsophisticated investors from being misled, but this is not the same context in which we operate. The EA community is built on a shared commitment to doing the most good, and we should be fostering a culture that supports experimentation, not one that limits it. We need to recognize that requiring new projects to adhere to rigid standards of transparency before they can secure funding risks pushing out precisely the kind of creative, high-impact initiatives we should be eager to explore.

Transparency and accountability are, of course, essential values, but they should not become tools that gatekeep or stifle innovation. The EA community should strive to balance these values by providing more flexible funding mechanisms that can accommodate early-stage projects. We need to be willing to take risks on new ideas, knowing that some will fail but that the potential rewards justify the investment.

If we truly want to make a difference, we must be open to supporting initiatives that might not have a perfect track record or a polished presentation yet. The focus should be on cultivating a diverse ecosystem of projects, where the potential for significant impact is given the opportunity to flourish. If we only support ideas for which evidential discovery has already been made, we’re simply exploiting existing knowledge rather than advancing new frontiers of impact.

In summary, while the call for transparency is understandable, it is directionally wrong for the EA community to set standards that disproportionately favor the well-resourced and established. We need to be updating far more in the direction of supporting new ideas and new organizations. The community must actively lower the barriers to entry, fostering a culture that is inclusive, open to new voices, and willing to take strategic risks on projects that might, with the right support, achieve extraordinary results.

 

EDIT: I wanted to add an important contrast between the for-profit and nonprofit worlds: a much higher appetite for risk on the for-profit side, with investors often being OK with moonshots that have a reasonable chance of becoming unicorns. In the nonprofit space, although lip service is paid to valuing moonshot projects, there is very little appetite for actually funding such projects.

According to ZeroGPT, this comment was 70% AI-generated.

I shared some of my thoughts regarding the post, asked ChatGPT to compile them into a more polished form, and then went through several subsequent prompts until it conveyed what I was trying to say. It changed generating a comment from something that would probably have taken 1.5 hours of work into something that took about 15 minutes and generated what I wanted to say.

What is your point in pointing out that it was AI-generated? AI is a good tool that can enable people to make their point in polished ways where time constraints would otherwise make that impossible or too costly.

It changed generating a comment from something that would have probably taken 1.5 hours of work to something that took about 15 minutes and generated what I wanted to say. 

Although I can't directly compare the ChatGPT version to a hypothetical directly-written version of the comment, my hunch is that the former is about twice as long as the latter would have been. It's pretty common for AI to need many more words to express an idea than a reasonably skilled human author would. So in a sense, I think generative AI use often shifts the time burden of the author-reader joint enterprise from the author to the readers. This may or may not be a good tradeoff on the whole, but it is worth considering both sides.

My general take is that content authored with that level of AI assistance should be flagged as such, so the reader can make their own decision about whether to engage with it.

Then take issue with the comment not being adequately concise, which better use of the AI probably could have accomplished.

Any gripe that you have with the content should be with the actual text, not with the tools that may have aided in generating it.

I do think the comment would have been much better received if it was more concise and simple to read (regardless of how it was written), see The value of content density 

Just expect to hear less, I guess.

generated what I wanted to say

Overall, do you stand by your comment? If I wrote a point-by-point response would some points get a "that's just something the LLM put in because it seemed plausible and isn't actually my view"?
