EA-adjacent-adjacent. opinions emphatically not those of my employer
When early digital experiments don't show ROI, many orgs seem to conclude that the channel itself is a poor fit, rather than that the execution, resourcing, or evaluation window was insufficient. Given small budgets and high standards of proof, it's not surprising that those early attempts fail, but that doesn't tell us much about the counterfactual of sustained investment.
yeah I basically think this is the problem, and agree that some level of investment would yield a return, but small orgs can't just keep putting in time and money for hypothetical return at some undetermined threshold! we're not for-profits that can take out loans or get VC money to sink into big upfront acquisition costs :')
again, if any funders are interested in funding EA digital marketing experiments for audience growth, I'm all ears... I'd like to see a case study of what level of investment is needed for smallish orgs to see a return, especially for fundraising asks.
hm, this is super interesting. I started Giving Green's comms/growth function, and at the time I remember talking to a bunch of EAs in comms and marketing roles for advice (RIP, EA comms Slack!) - almost everyone said: digital marketing hasn't worked for us; what's worked is earned media and relationship-building. I don't really know whether that's due to underinvestment in good digital marketing, the broader nonprofit fundraising environment, the FTX collapse, or something else, but I think it's worth noting that a lot of us have experimented with digital marketing and not seen ROI. (And at least for Giving Green specifically, we do have people on staff who come from growth-oriented digital roles, myself sort of included, but we just haven't seen a return from those channels.) I do think I see EA orgs investing more in digital marketing now, and I'm excited about what the ecosystem will learn in the next few years.
A couple of other theories I have:
I'd be curious whether you're seeing digital fundraising success in spaces and with asks similar to EA orgs' - I'm not sure I can think of an example of this, even among our non-EA partners.
there was a post about this last year: https://forum.effectivealtruism.org/posts/oFcLqTETnC8rajxeg/advisors-for-smaller-major-donors
tl;dr is (1) a lot of evaluators will do this for their cause area (can't speak to every one but Giving Green is happy to advise donors of any size, just shoot us an email); (2) look into giving circles inside or outside EA
I'd add that it's probably worth seeking out a financial advisor for the tax-law and will-writing questions -- a lot of EA advising orgs offer free initial services, but I've been told that total assets >$100k is generally the point at which it makes sense to find an advisor.
Thank you for this! I had promised a couple of friends that I'd write up something for their upcoming spinouts and I'm very glad that someone who knows more than me has done it instead :D
Two caveats I'd suggest to future nonprofit-starters, just because they're currently a giant headache for me:
I love that High Impact Engineers is back, and I generally prefer a forum-style space to Slack, but I want to push back specifically on "many engineers have a GitHub account" -- especially since your goal is to be welcoming to non-software engineers, I wouldn't make this assumption! I was a materials engineering undergrad, and none of my classes/internships/research projects used GitHub. Maybe it's really taken over in the last 10 years or something, but if not, you might want to consider being a bit more 101-level with the GitHub stuff, e.g. not using jargon like 'pull' or 'repo' without explanation -- when I see that kind of thing, my immediate personal reaction is "oh, this is a space for software engineers".
in addition to everything already said, I think this can be bad from an organizational sustainability perspective: if you decide to leave, get hit by a truck, etc., the organization now doesn't have the budget to hire someone new to do the work, meaning that some commitments will need to be dropped. Some funders will see this type of thing as a bad signal about how the organization is managed.
Another way of leveraging your relative class privilege could be taking a part-time job and doing impactful volunteer work!
interestingly i've talked to a couple of other asian women in EA who have sort of an opposite experience—we (including myself here) feel like EA ideas and communities fundamentally don't capture things that are important to us as asian women, and so that actually forces us to be more balanced and draw our values from multiple places, rather than holding ourselves to a standard of being a more-optimized EA. one very literal example someone mentioned to me is that western cultures and traditions of thought emphasize breaking down systems into discrete parts and optimizing on single goals, and eastern cultures/traditions value more holistic thinking and balance.
i can also 100% see where you're coming from, as someone who grew up in a very competitive asian/immigrant community myself—i'm sharing another experience not to discount yours, but to highlight alternate ways of thinking about your own relationship to EA that might help you find that balance!
as a former optimizer: #1 thing that helped was therapy
I once saw on the Forum that someone had scraped the 990s from a bunch of EA and AI safety* orgs and put all the salaries in a spreadsheet, with names - it wouldn't be that hard to go from that to at least an estimate of what you're looking for, at least for the highest-paid employees (rough sketch of pulling the underlying 990 data below). I can't find the post anymore, and I want to respect that they might have taken it down for good reason, but given that it's public information, if some enterprising, data-wrangling Forum poster wants to DM me, I'm not opposed to sharing the link...
*I do have a loose intuition that besides the grantmaker/grantee divide, the AI/not-AI divide within EA is driving some of the bizarre funding and salary dynamics
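For anyone who'd rather rebuild that kind of spreadsheet than wait on my link: here's a rough sketch (mine, not the original poster's method) of pulling an org's Form 990 filings from ProPublica's Nonprofit Explorer API. The endpoint paths and field names are from memory, so double-check them against the API docs, and note that per-person officer/key-employee compensation is reported in Part VII of the 990 itself, so you'd still open each filing for named salaries. The org name in the example is just a placeholder.

```python
# Rough sketch: look up an org in ProPublica's Nonprofit Explorer and list its
# 990 filings. Endpoint paths and JSON field names are from memory -- verify
# against the API docs before relying on them.
import requests

BASE = "https://projects.propublica.org/nonprofits/api/v2"

def find_ein(org_name: str) -> int | None:
    """Search Nonprofit Explorer by name and return the first matching EIN."""
    resp = requests.get(f"{BASE}/search.json", params={"q": org_name}, timeout=30)
    resp.raise_for_status()
    orgs = resp.json().get("organizations", [])
    return orgs[0]["ein"] if orgs else None

def list_filings(ein: int) -> list[dict]:
    """Return the filings ProPublica has extracted summary data for."""
    resp = requests.get(f"{BASE}/organizations/{ein}.json", timeout=30)
    resp.raise_for_status()
    return resp.json().get("filings_with_data", [])

if __name__ == "__main__":
    ein = find_ein("Example EA Org")  # placeholder query, not a real lookup
    if ein:
        for filing in list_filings(ein):
            # Summary fields only; per-person compensation lives in Part VII
            # of the filing itself (open the PDF/XML for named salaries).
            print(filing.get("tax_prd_yr"), filing.get("totrevenue"), filing.get("pdf_url"))
```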