That's a helpful clarification, thank you. I would be concerned, then, that if an organization were motivated to get SoGive's seal of approval, they could improve their ratio by designating more of their money for specific purposes. Wouldn't it be pretty easy to write down a four-year (non-binding) plan that would convert much of the current "reserves" to "designated funds"?
I think the optimal level of reserves could vary significantly across organizations. In some cases, having a high level of reserves could make it easier to attract and retain key senior staff members. A 20-something EA might feel comfortable going to work for an org with a short runway, but someone mid-career with a family and who is asked to relocate might feel differently. Institutions and individuals might also be more inclined to collaborate with an organization that appears likely to be around for a while.
Suppose an organization spends 1/4 of its reserves every year and earns a 5% return on the remainder. If I make a $1 donation, the org would increase its spending by $0.25 in year 1. In year 2 it would increase its spending by (0.75)*(1.05)*(0.25) ≈ $0.20, in year 3 by about $0.16, in year 4 by about $0.12, etc. In the limit the full donation, plus accrued interest, gets spent, even if it sits in a bank for a while. The timing would concern me only if I felt that money spent on nuclear security this year would be significantly more valuable than money spent in subsequent years.
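The spend-down pattern above can be sketched in a few lines of Python (a minimal model, assuming the stated 1/4 spend rate and 5% return; the function and parameter names are my own, not from any particular source):

```python
# Model of a one-time donation spent down by an org that spends 1/4 of
# reserves each year and earns 5% on what remains (assumed parameters).
SPEND_RATE = 0.25
RETURN_RATE = 0.05

def spend_schedule(donation=1.0, years=10):
    """Return the list of incremental yearly spending from a donation."""
    reserves = donation
    spending = []
    for _ in range(years):
        spent = reserves * SPEND_RATE
        spending.append(spent)
        # Unspent reserves grow at the assumed return rate.
        reserves = (reserves - spent) * (1 + RETURN_RATE)
    return spending

schedule = spend_schedule()
# Yearly spending declines geometrically: ~0.25, ~0.20, ~0.16, ~0.12, ...
```

Each year's spending is the previous year's times (0.75)*(1.05) = 0.7875, so cumulative spending converges to 0.25/(1 − 0.7875) ≈ $1.18 per dollar donated: the donation plus accrued interest.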
I’m a little late to this discussion, but I first want to say thank you for this post! This is a topic I’ve been interested in of late and your post filled in a lot of gaps in my knowledge.
I see the potential for this to be tractable for EA funders and entrepreneurs, because non-EA funders might be incentivized to fund the deployment of interventions that are effective enough. It’s unfortunate that lonely people are less healthy and less productive, in addition to having lower life-satisfaction, but this phenomenon may have a silver lining. There might be interventions that benefit lonely people, and that also provide enough benefit to their employers and/or insurers for those companies to be financially motivated to provide them. An intervention might not clear the EA bar for $/DALY or $/WELLBY, but if it provides a favorable return on investment for employers and insurers, it has the potential to be deployed at scale. An EA funder might still need to fund the development of the intervention, demonstration of its effectiveness, and the spreading of information about its effectiveness. And EA entrepreneurs would need to do the work.
There’s an organization called the Foundation for Social Connection that is funded by the Coalition to End Social Isolation. Two of the four members of the latter’s steering committee are large health insurers, and there are some large corporations among their members. The Foundation for Social Connection has an Innovation Accelerator that could be a good resource for organizations interested in this space. Some of the researchers cited in this post are affiliated with the organization.
In addition to employers and insurers, school districts are another possible funder for effective interventions. I know of a Social-Emotional Learning curriculum that aims to help students make “intentional connections.” Schools could potentially see short-term benefits that justify the cost (primarily the opportunity cost of classroom time) if the intervention improves students’ behavior, engagement, and well-being. Long-term benefits could be significant as well, but these would be more difficult to demonstrate.
One other thought: there might be benefit to thinking more broadly in terms of improving people’s relationships or connectedness, rather than just addressing loneliness. I would suspect that most people would benefit from some combination of (1) increased awareness of the importance of relationships for their physical health and emotional wellbeing; and (2) resources to help them improve their existing relationships and to form new high quality relationships.
Thank you for sharing this! Do you think your program will work better for people with significant meditation experience? And do you think your own experience was somewhat contingent on the meditation work you did in the Finder's Course (not just the discovery that you benefited from loving-kindness meditation, but the benefit of the meditation "reps" you'd already put in)?
Another EA connection is that Samantha Power, the USAID Administrator who appointed Dean Karlan, is married to Cass Sunstein, who has spoken at EA Global and was once a guest on the 80,000 Hours podcast.
I find it disappointing that he tries to use EA as a shield (p. 17: "As a believer in the Effective Altruism movement, my primary goal has never been personal enrichment; I'm motivated by a commitment to help bring happiness and alleviate suffering for others."). This is in the context of denying that he has billions of dollars stashed away. If he really cared about bringing happiness and alleviating suffering, why would he further tarnish the EA community's reputation by associating himself with it in testimony before Congress?
I think it depends at least in part on one's view of the long run value of crypto assets. I'm skeptical that they are worth what they are currently valued at, in aggregate (and am more skeptical that they were worth the prices they were trading at a year ago). So I think it would have been unethical for me personally to be paying for Super Bowl ads encouraging people to get into crypto. But if Tom Brady or whoever genuinely believed that it was in people's best interest to buy some crypto, I'm not really inclined to judge them for encouraging investment.
But I think there's a difference between investment and trading. It's harder to justify encouraging people to day trade, given that day traders lose money in aggregate (mostly via exchange fees). I'd be curious if someone could make a case for encouraging short-term trading.
Thank you for this post. I missed it when it was originally posted, and only came across it via the recent "Friendship Forever" post. An organization doing work in this area that might be of interest is the Foundation for Social Connection. They have an "Innovation Accelerator" that could potentially provide funding for projects addressing loneliness. It looks like they are funded in part by two large health insurance companies (Humana and United Healthcare), based on this.
Thanks for your reply. I do think it would be unusual to see such promises, particularly from a firm seeking large investments, and I would expect to see a bunch of disclaimers, as you suggest. There might have been such language in the actual investment documents, but still. The excerpt shared on Twitter would have set off red flags for me because it seems sloppy and unprofessional, and it would have made me particularly concerned about their risk management, but I wouldn't have concluded it was a Ponzi scheme or that there was something fraudulent going on with the reported returns.
It will be interesting to see if all of the FTX/Alameda fraud (if there was fraud, which seems very likely) took place after the most recent investment round. Investors may have failed not in financial diligence but in ensuring appropriate governance and controls (and, apparently, in assessing the character of FTX's leadership).