Nick_Beckstead's Comments

The EA Community and Long-Term Future Funds Lack Transparency and Accountability

Hi Evan, let me address some of the topics you’ve raised in turn.

Regarding original intentions and new information obtained:

  • At the time that the funds were formed, it was an open question in my mind how much of the funding would support established organizations vs. emerging organizations.
  • Since then, three things changed: EA Grants got started; I encountered fewer emerging organizations than I expected that I wanted to prioritize funding; and Open Phil funding to established organizations grew more than I expected.
  • Together, these three factors meant I had fewer grants to make that couldn’t be made in other ways than I had expected.
  • The first two factors contributed to a desire to focus primarily on established organizations.
  • The third cuts against this, but I still see the balance of considerations as favoring a focus on established organizations.

Regarding my/CEA’s communications about the purposes of the funds: It seems you and some others have gotten the impression that the EA Funds I manage were originally intended to focus on emerging organizations over established organizations. I don’t think this is communicated in the main places I would expect it to appear if the funds were definitely focused on emerging organizations. For example, the description of the Long-Term Future Fund reads:

“This fund will support organizations that work on improving long-term outcomes for humanity. Grants will likely go to organizations that seek to reduce global catastrophic risks, especially those relating to advanced artificial intelligence.”

And “What sorts of interventions or organizations might this fund support?” reads:

"In the biography on the right you can see a list of organizations the Fund Manager has previously supported, including a wide variety of organizations such as the Centre for the Study of Existential Risk, Future of Life Institute and the Center for Applied Rationality. These organizations vary in their strategies for improving the long-term future but are likely to include activities such as research into possible existential risks and their mitigation, and priorities for robust and beneficial artificial intelligence."

The new grants also strike me as a natural continuation of the “grant history” section. Based on the above, I'd have thought the more natural interpretation was, "You are giving money for Nick Beckstead to regrant at his discretion to organizations in the EA/GCR space."

The main piece of evidence I see in your write-up that these funds were billed as focused on emerging organizations is this statement under “Why might you choose not to donate to this fund?”:

“First, donors who prefer to support established organizations. The fund manager has a track record of funding newer organizations and this trend is likely to continue, provided that promising opportunities continue to exist.”

I understand how this is confusing, and I regret the way that we worded it. I can see that this could give someone the impression that the fund would focus primarily on emerging organizations, and that isn’t what I intended to communicate.

What I wanted to communicate was that I might fund many emerging organizations if that seemed like the best idea, and I wanted to warn donors about the risks involved with funding emerging organizations. Indeed, two early grants from these funds went to emerging organizations, BERI and EA Sweden, so I think it was good that some warning was included. That said, even at the time this was written, I think “likely” was too strong a word, and “may” would have been more appropriate. It’s simply an error that I failed to catch. In a panel discussion at EA Global in 2017, my answer to a related question about funding new vs. established organizations was more tentative, and it better reflects what I think the page should have said.

There are a couple of other statements like this on the page that could have been misinterpreted in similar ways, and I have regrets about those as well.

The EA Community and Long-Term Future Funds Lack Transparency and Accountability

Thanks for sharing your concerns, Evan. It sounds like your core concerns relate to (i) delay between receipt and use of funds, (ii) focus on established grantees over new and emerging grantees, and (iii) limited attention to these funds. Some thoughts and comments on these points:

  • I recently recommended a series of grants that will use up all of the EA Funds under my discretion. This became a larger priority in the last few months due to an influx of cryptocurrency donations. I expect a public announcement of the details once all grant logistics have been completed.

  • A major reason I haven’t made many grants is that most of the grants that I wanted to make could be made through Open Phil, and I’ve focused my attention on my Open Phil grantmaking because the amount of funding available is larger.

  • I am hopeful that EA Grants and BERI will provide funding to new projects in these areas. CEA and BERI strike me as likely to make good choices about funding new projects in these areas, and I think this makes sense as a division of labor. EA Grants isn’t currently open to public applications, but I’m hopeful they’ll have a public funding round soon. BERI issued a request for proposals last month. As these programs mature, I expect that most of what is perceived as a funding gap in these areas will be driven by differences of taste or judgment with these grantmakers rather than by a lack of funding.

For now, I don’t have any plans to change the focus or frequency of my grantmaking with these funds from what was indicated in my April 2018 update.

I think it’s probably true that a fund manager with more time to manage these funds would be preferable, provided we could find someone with suitable qualifications. This possibility is under consideration right now, but progress will depend on the availability of a suitable manager and further thinking about how to allocate attention to this issue relative to other priorities.

Hi, I'm Holden Karnofsky. AMA about jobs at Open Philanthropy

In addition to, for 35 days total. (I work at Open Phil.)

Hi, I'm Holden Karnofsky. AMA about jobs at Open Philanthropy

I don't mean to make a claim re: averages, just relaying personal experience.

Hi, I'm Holden Karnofsky. AMA about jobs at Open Philanthropy

I am a Program Officer at Open Philanthropy who joined as a Research Analyst about 3 years ago.

The prior two places I lived were New Brunswick, NJ and Oxford, UK. I live in a house with a few friends, and my commute is 25-30 minutes door-to-door via BART. My rent and monthly expenses are comparable to what I had in Oxford but noticeably higher than what I had in New Brunswick. I got a pay increase when I moved to Open Phil, and additional raises over time. I’m comfortable on my current salary and could afford a one-bedroom apartment if I wanted one, but I’m happy where I am.

Overall, I would say that it was an easy adjustment.

How important is marginal earning to give?

To avoid confusing people: my own annual contributions to charity are modest.

Should we launch a podcast about high-impact projects and people?

You might consider having a look at http://www.flamingswordofjustice.com/ . It's a podcast of interviews with activists of various types (pretty left-wing). I've listened to a few episodes and found it interesting. It was the closest thing I could think of that already exists.

Open Thread

I would love to see some action in this space. I think there is a natural harmony between what is best in Christianity, especially regarding helping the global poor, and effective altruism.

One person to consider speaking with is Charlie Camosy, who has worked with Peter Singer in the past (see info here). A couple other people to consider talking with would be Catriona Mackay and Alex Foster.

Cosmopolitanism

One attractive feature of cosmopolitanism, in contrast with impartial benevolence, is that impartial benevolence is often associated with denying that loved ones and family members are worthy targets of special concern, whereas I don’t think cosmopolitanism carries such associations. Another is that I think a larger fraction of educated people already have some familiarity with cosmopolitanism.

Good policy ideas that won’t happen (yet)

Niel, thanks for writing up this post. I think it’s really worthwhile for us to discuss with the community the challenges we encounter while working on EA projects.

I noticed that the link in this sentence is broken:

Creating more disaster shelters to protect against global catastrophic risks (too weird)
