I run Antigravity Investments, an EA-aligned investing firm that helps EA nonprofits and individuals with investing. Our EA Forum article with public recommendations is mostly focused on cash management, although Appendix B discusses higher-EV investing options.
We give free advice and typically charge a low fee for directly managing portfolios. Feel free to reach out at firstname.lastname@example.org.
For a DIY approach in the United States, we recommend a portfolio of low-fee ETFs. A DIY approach with ETFs makes it possible to donate appreciated investments without paying capital gains tax on the appreciation, which is an optimal way to donate.
Thanks for asking! At this time we do not specifically limit the types of early-stage high-impact activities that can apply. Early-stage nonprofits, for-profits, and personal projects would all fall under the scope of acceptable activity types.
The following arguments are ideas and have not been thoroughly researched. They may not reflect my actual views. Counterarguments are not mentioned because the OP is "mainly interested in seeing critiques." I may post counterarguments after the reward deadline has passed.
Claim to argue against: "$172,000 to the EA Hotel has at least as much EV as $172,000 distributed randomly among EA Meta Fund grantees or EA Grants grantees."
Argument 1: The EA Hotel has a low counterfactually-adjusted impact
In this post, the EA Hotel states:
Out of 19 residents, 15 would be doing the same work counterfactually, but the hotel allows them to do, on average, 2.2 times more EA work -- as opposed to working a part time job to self-fund, or burning more runway.
This datapoint supports the view that most EA Hotel residents would be doing the same work whether or not they stayed at the hotel. The claim that "the hotel allows them to do, on average, 2.2 times more EA work" could be incorrect. To gain more certainty about this, the EA Hotel should track what applicants who are not accepted actually end up doing instead.
EA Hotel residents have many options for doing the same work without staying at the hotel. For example, depending on the time and location requirements of the work, they could do some combination of: (1) working part-time to finance their living expenses, (2) living with parents, with friends, or in another location with near-zero living expenses, or (3) living in very low-cost housing comparable in cost to the EA Hotel.
If someone can pursue option (2), the EA Hotel is negative EV for them: a free option exists, while staying at the hotel consumes community funds.
If someone pursues options (1) and (3), they might only have to do paid work for a very limited amount of time. For example, I believe I recently heard of someone who was able to find a bedroom in a large shared house in Berkeley, CA for $500 a month, although they have to share a bathroom with many people. So someone might only need to do paid work 25% of the time and can do EA work the remaining 75%. This suggests that the "2.2 times more EA work" figure greatly overstates the benefit of the EA Hotel in terms of reducing living expenses. Pursuing options (1) and (3) seems feasible for the vast majority of people.
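As a rough sanity check on the 25% figure, here is a back-of-the-envelope calculation. The wage and non-rent expense numbers are my own illustrative assumptions, not figures from the post:

```python
# What fraction of full-time paid work covers living expenses under option (3)?
# All inputs except the $500/month rent example are assumptions.
annual_rent = 500 * 12      # $500/month low-cost room (example from the text)
other_expenses = 4_000      # assumed food, transport, misc. per year
hourly_wage = 20            # assumed part-time wage
full_time_hours = 2_000     # ~40 hours/week, 50 weeks/year

paid_hours_needed = (annual_rent + other_expenses) / hourly_wage
fraction_of_full_time = paid_hours_needed / full_time_hours
print(f"{fraction_of_full_time:.0%} paid work, {1 - fraction_of_full_time:.0%} EA work")
```

Under these assumptions the split comes out to 25% paid work and 75% EA work; with a lower wage or higher expenses the paid-work share rises accordingly.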
If direct funding allows people to pursue option (3) and secure low-cost housing at around the same cost as the EA Hotel, there may be no need for the EA Hotel itself to exist. The question becomes: what is the counterfactually-adjusted impact of funding living expenses at the EA Hotel compared to option (3)? Adjustments should be made for things like missing out on the benefits of living somewhere other than Blackpool, as well as relocation time and expenses, which would further reduce counterfactual impact. The EA Hotel community certainly provides benefits, although coworking out of REACH may provide similar benefits.
Argument 2: The EA Hotel should charge users directly instead of raising funding
Rather than fundraising from EAs, the hotel should try to directly charge people who are benefiting from their services and community, which is an argument against donating to the hotel.
There doesn't seem to be a need to fund people who can already afford the hotel. It's not clear what proportion of residents fall into this category, but considering that it takes only about 13 weeks of full-time work at $15/hour to pay $7,900 for a one-year stay, it is possible that a majority of residents can already afford to stay at the hotel.
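The arithmetic behind the 13-week figure, as a quick check (the 40-hour week is my assumption):

```python
# Weeks of full-time work at $15/hour needed to cover a one-year stay.
stay_cost = 7_900        # one-year stay at the hotel, from the text
hourly_wage = 15         # from the text
hours_per_week = 40      # assumed full-time week

weeks_needed = stay_cost / (hourly_wage * hours_per_week)
print(f"{weeks_needed:.1f} weeks")  # roughly 13 weeks
```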
For people who cannot afford the EA Hotel, applicants to funding organizations like EA Grants can include that they are requesting funding for living expenses and indicate EA Hotel expenses as part of their requested grant funding. EA Grants evaluators and other funders may be better equipped to evaluate the EV of projects people are working on as opposed to EA Hotel staff. If EA Grants can already cover this, there is no need to donate to the EA Hotel.
Argument 3: Funding projects has a higher impact than funding living expenses
I assume that EA Grants funds applicants' project expenses as well as their personal salary and living expenses. This could be higher impact than solely funding living expenses. Working at the EA Hotel on an unfunded project may be quite unproductive, particularly if the project requires funding to get anywhere. Seeking early-stage EA project funding seems to involve waiting for long periods of time (perhaps months) for funders to get back to you, rather than actively working full-time to acquire funding.
Argument 4: People should not donate to the EA Hotel until they improve their impact metrics and reporting
The EV estimation for the EA Hotel is highly mathematical and commenters have expressed that it is difficult to follow. Actual impact reporting appears to consist of testimonials which are hard to evaluate. It's even trickier to evaluate the counterfactually-adjusted impact.
There is probably a nontrivial number of people who do not seek support due to the presence of a fee, even if they can theoretically afford it (see trivial inconveniences). Unfortunately, I've seen this happen in practice.
The potential downside (and upside) of diversifying by adding some tilts and consistently sticking with them is limited, so I don’t see a major problem with “non-advanced investors” following the advice. Investors should be aware of things like rebalancing and capital gains tax; perhaps “intermediate investor” is a better term.
It takes a certain degree of investment knowledge and time to form an opinion about the historical performance of different factors and expected future performance. It also requires knowledge and time to determine how to appropriately incorporate factors into a portfolio and how to adjust exposure over time. For example, what should be done if a factor underperforms the market for a noticeable period of time? An investor needs to decide whether to reduce or eliminate exposure to a factor or not. Holding an investment that will continue to underperform is bad, but selling an investment that is experiencing cyclical underperformance is a bad timing decision which will worsen performance each time such an error is made.
As a concrete example, the momentum factor has had notable crashes throughout history that could cause concern and uncertainty among investors who were not expecting that behavior. Decisions to add factors to portfolios need to take into account maintaining an appropriate level of diversification, tax concerns (selling a factor fund could incur capital gains taxes, and factor mutual funds pass on the capital gains they realize while trading to follow factors, whereas factor ETFs almost certainly won't), and the impact of fees, among other considerations.
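To illustrate why the distribution-versus-deferral distinction matters, here is a simplified sketch of the tax drag from taxable annual capital gains distributions compared with deferring all gains until sale. The return, horizon, and tax rate are illustrative assumptions, and the model ignores cost-basis adjustments from reinvested distributions:

```python
# Compare deferring capital gains (ETF-like) vs. paying tax on annual
# distributions (mutual-fund-like). All parameters are illustrative.
initial = 10_000
annual_return = 0.07
years = 10
cap_gains_rate = 0.15

# Deferred: gains compound untaxed, taxed once on the final sale.
final_deferred = initial * (1 + annual_return) ** years
after_tax_deferred = final_deferred - cap_gains_rate * (final_deferred - initial)

# Distributed: assume the fund distributes, and the investor pays tax on,
# all gains each year, so only the after-tax return compounds.
after_tax_annual_return = annual_return * (1 - cap_gains_rate)
after_tax_distributed = initial * (1 + after_tax_annual_return) ** years

print(f"deferred: ${after_tax_deferred:,.0f}  distributed: ${after_tax_distributed:,.0f}")
```

Under these assumptions the deferred (ETF-like) investor ends up a few hundred dollars ahead, and the gap widens with longer horizons and higher returns.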
This post was intended as a grant application announcement post that also happened to contain some information about new funder-friendly and applicant-friendly policies we are adopting. I did not include any information about our evaluation process or risk reduction process in the body of the post, so I would not expect the post to convey high awareness of either of the reasons why long-termist applications don't get funded.
I am curious which of the ideas we included you think address your first point about grantmakers being unable to vet the project. I'm not sure whether application sharing, rolling applications, or providing feedback to grant applicants address your first or second points.
To elaborate more on risk, I wrote in another comment on this post that:
We have several layers of checks to help reduce risks and improve grant decision making including initial staff review of incoming applications, angels sharing their evaluations with one another and talking with external contacts/experts if appropriate, and hearing opinions of external grantmakers on grant applications we have received (we still need to talk with grantmakers to set this up).
I think that an initial staff review can help detect risks, and if we notice a large problem with downside risk in incoming projects, we can enhance the initial staff review process. The angel evaluation period is where a lot of nuanced considerations about risk can come up, since angels can share their perspectives on a grant proposal with other angels and external experts, and we have angels with significant experience in areas like meta and AI. Finally, this wasn't mentioned in the post, but we are aiming to share evaluations both ways with funders in EA. I think this can go a long way towards making all funders aware of all of the potential risks of a project.
Angels in the group seem to actively avoid funding projects that they feel they are not qualified to evaluate. Angels can point out funding behavior that they perceive is risky from other angels, although from what I've seen, our angels lean more on the side of risk avoidance than anything else.
High-quality grant applications tend to get funded quickly and are thereby eliminated from the pool of proposals available to the EA community, while applicants with higher-risk proposals tend to apply/pitch to lots of funders. This means that on average, proposals submitted to funders will be skewed towards high-downside-risk projects, and funders could themselves easily do harm if they end up supporting many of them. I'd be interested in your thoughts on that.
As Denise mentioned in a post on Jan's project evaluation idea, there is a category of project that is "projects which are simply bad because they do have approximately zero impact, but aren't particularly risky. I think this category is the largest of the four." This lines up with many of the applications I am seeing. This might be different with long-term/x-risk projects specifically, but since we are a general funding group with individual EA funders with a wide variety of backgrounds and experiences, we are not receiving a large number of such applications relative to the entire pool of applications.
Therefore, I wouldn't say that our applications are likely to be "skewed towards high-downside-risk projects." I expect to continue to receive a large number of projects that may have very low impact, just like other funders are likely receiving. As Oliver mentioned, "in practice I think people will have models that will output a net-positive impact or a net-negative impact, depending on certain facts that they have uncertainty about, and understanding those cruxes and uncertainties is the key thing in understanding whether a project will be worth working on." I think that other EA funders will fund projects that match the model of the funders, but because people's models differ wildly and are very likely wrong in many cases (as evidenced by the high failure rate of startups funded by even the most successful VCs), I don't know if other funders are actually funding a significant fraction of the opportunities that end up having the highest impact.
To my understanding, EA Grants is the only other funder making general grants; BERI Grants and the EAF Fund focus exclusively on long-term projects, and the EA Funds focus on their respective areas and also fund larger organizations. Since EA Grants is currently closed for applications (I support rolling applications rather than application rounds), we are receiving applications that have not been funded by other funders simply because the only other general funder isn't accepting applications right now. With funder application sharing, which I support, funders will be able to see the entire pool of proposals rather than the pool minus the projects other funders have already funded. This will help each funder evaluate the quality of the projects they fund relative to the quality of the projects other funders have funded.
I really like that you're providing feedback to applicants! In general, I wish the EA community was more proactive with providing critical feedback.
Thanks! I completely agree.
I think it is fair to say you expected very low risk from creating an open platform where people would just post projects and seek volunteers and funding, while I expected that with minimal curation this creates significant risk (even if the risk comes from a small fraction of projects). Sorry if I rounded off suggestions like "let's make an open platform without careful evaluation and see" and "based on the project idea lists which existed several years ago, the amount of harmful projects seems low" to "worrying about them is premature".
The community has already had many instances of openly writing about ideas, seeking funding on the EA Forum, Patreon, and elsewhere, and posting projects in places like the .impact hackpad and the currently active EA Work Club. Since posting about projects and making them known to community members seems to be a norm, I am curious about your assessment of the risk and what, if anything, can be done about it.
Do you propose that all EA project leaders seek approval from a central evaluation committee or something before talking with others about and publicizing the existence of their project? This would highly concern me because I think it's very challenging to predict the outcomes of a project, which is evidenced by the fact that people have wildly different opinions on how good of an idea or how good of a startup something is. Such a system could be very negative EV by greatly reducing the number of projects being pursued by providing initial negative feedback that doesn't reflect how the project would have turned out or decreasing the success of projects because other people are afraid to support a project that did not get backing from an evaluation system. I expect significant inaccuracy from my own project evaluation system as well as the project evaluation systems of other people and evaluation groups.
Thanks - both of those happened after I posted my comment, and I still do not see the numbers that would help me estimate the ratio of projects which applied to projects which got funded. I take it as a mildly negative signal that someone had to ask and this info was not included in the post, which solicits project proposals and volunteer work.
In my model it seems possible you have something like a chicken-and-egg problem: not getting many great proposals, and the group of unnamed angels not funding many of the proposals coming via that pipeline.
If this is the case and the actual number of successfully funded projects is low, I think it is necessary to state this clearly before inviting people to work on proposals. My vague impression is that we may disagree on this, which seems to indicate some quite deep disagreement about how funders should treat projects.
I wrote about the chicken-and-egg problem here. As noted in my comments on the announcement post, the angels have significant amounts of funding available. Other funders do not disclose some of these statistics, and while we may do so in the future, I do not think it is necessary before soliciting proposals. The time cost of applying is pretty low, particularly if people are recycling content they have already written. I think we are the first grantmaking group to give all applicants feedback on their applications, which is valuable even if they do not get funded.
The whole context was that Ryan suggested I should have sought some feedback from you. I actually did that, and your co-founder noted on the 11th of March that he would try to write feedback on this "today or tomorrow", which did not happen. I don't think this is a large problem, as we had already discussed the topic extensively.
Ben commented on your Google Document that was seeking feedback. I wouldn't say we've discussed the topic "extensively" in the brief call that we had. The devil is in the details, as they say.
John Maxwell brought up some interesting points. He suggests that platforms can experience the chicken-and-egg problem when getting started, and that intensive networking is a way to overcome it. I agree that platforms often have this problem, but the EA Angel Group resolved it not by networking intensively but by offering a lot of value to angels, which incentivizes them to join the platform even without a large number of existing grant applicants, which in turn incentivizes grant applicants to apply.
Of course, we do need a stream of incoming grant applications to remain viable, and unfortunately we encountered some unexpected issues when attempting to collaborate with EA Grants and to speak with many community members as part of several strategies to acquire grant applications. As mentioned in my progress update comment, I am currently pursuing alternate strategies to achieve this objective, ones involving steps that I have greater control over (and fewer steps that require the approval of entities whose decisions I cannot influence). That being said, I think networking and collaboration are highly valuable, and I am scaling that up even as I pursue strategies that do not require networking to succeed.
I wrote a progress update comment regarding the EA Angel Group which covered our grant opportunity discovery activities over the last few months. We spoke with EA Grants several months ago, and to the best of my knowledge they are still determining whether to send and receive grant applications with other funders. At least one major funding group has expressed significant interest in sending and receiving grant applications with the EA Angel Group, and we are in the process of talking with various funders about this.
I mentioned the one concern I heard and my response to it in my progress update comment:
One objection to sharing grant applications among funders is that a funder would fund all of the grant proposals they felt were good and classify all other grant proposals as not suitable to be funded. From the funder's perspective, sharing the unfunded grant proposals would be bad since other organizations could subsequently fund them, and the funder classified those grant proposals as not worth funding. I personally disagree with this objection because the argument assumes that a funder has developed a grant evaluation process that can actually identify successful projects with a high degree of accuracy. Since the norm in the for-profit world involves large and successful venture capital firms with lots of experienced domain experts regularly passing on opportunities that later become multibillion-dollar companies, I find it unlikely that any EA funding organization will develop a grant evaluation process that is so good it justifies hiding some or all unfunded applications.
Can you elaborate on:
I think for example that a ‘just-another-universal-protocol’ worry would be very reasonable to have here.
Are you suggesting that funders may be concerned about adopting a protocol which ends up providing limited value? As I've stated in several other comments, I think sharing grant applications can be of considerable value since arbitrarily limiting the pool of projects seems pretty suboptimal.
To avoid that I think we need to do the hard work of reaching out to involved parties and have many conversations to incorporate their most important considerations and start mutually useful collaborations. I.e. consensus building.
I agree. I did some initial outreach at first and will begin additional outreach shortly.