Evan R. Murphy

AI Alignment Researcher @ Independent/Non-profit
Working (6-15 years of experience)
434 karma · Vancouver, BC, Canada · Joined Oct 2021

Bio

Formerly a software engineer at Google, now I'm doing independent AI alignment research.

Because of my focus on AI alignment, I tend to post more on LessWrong and AI Alignment Forum than I do here.

I'm always happy to connect with other researchers or people interested in AI alignment and effective altruism. Feel free to send me a private message!

Comments
59

Will all the results of the survey be shared publicly on the EA Forum? I couldn't find any mention of this in the couple of announcements I've seen for this survey.

It looks like at least some of the 2020 survey results were shared publicly. [1, 2, 3] But I can't find 2021 survey results. (Maybe there was no 2021 EA Survey?)

Thanks for the link and highlights!

Sam claims that he donated to Republicans: "I donated to both parties. I donated about the same amount to both parties (...) That was not generally known (...) All my Republican donations were dark (...) and the reason was not for regulatory reasons - it's just that reporters freak the fuck out if you donate to Republicans [inaudible] they're all liberal, and I didn't want to have that fight". If true, this seems to fit the notion that Sam didn't just donate to look good (i.e. he donated at least partly because of his personal altruistic beliefs)

What do you mean that this donation strategy would be from Sam's "personal altruistic beliefs"? Donating equally to both political parties has been a common strategy among major corporations for a long time. It's a way for them to push their own agenda in government. It's generally an amoral self-interested strategy, not an altruistic one.

I am a big fan of gratitude practice. I try to write a little in a gratitude journal most nights, which has helped my overall state of mind since I started doing it. I would recommend it to anybody, including people involved in EA. And I'm glad you suggested it, as a little gratitude during a crisis like this can be especially helpful.

I have some reservations about posting things I'm grateful for publicly on this forum though. Gratitude can be a bit vulnerable, and this forum has more eyes on it than usual lately. Posting to a community about why you're thankful for that community could also be misinterpreted as being obsequious or virtue signalling. I think most of the benefits of gratitude practice can be enjoyed privately or with someone you trust, but if other people felt inclined to share their gratitude here, I would probably enjoy reading it and not be judgmental. And I may change my mind later and post some of that here as well :)

I would probably be more excited about this thread if the forum had a feature to post comments anonymously. I don't see any downside to an anonymous public gratitude thread, but I'm probably too lazy to create an anonymous account just for that purpose.

Ultimately this was a failure of the EA ideas more so than the EA community. SBF used EA ideas as a justification for his actions. Very few EAs would condone his amoral stance w.r.t. business ethics, but business ethics isn't really a central part of EA ideas. Ultimately, I think the main failure was EAs failing to adequately condemn naive utilitarianism. 

So I disagree with this because:

  1. It's unclear whether it's right to attribute SBF's choices to a failure of EA ideas. Following SBF's interview with Kelsey Piper and based on other things I've been reading, I don't think we can be sure at this point whether SBF was generally more motivated by naive utilitarianism or by seeking to expand his own power and influence. And it's unclear which of those headspaces led him to the decision to defraud FTX customers.
  2. It's plausible there actually were serious ways that the EA community failed with respect to SBF. According to a couple of accounts, at least several people in the community had reason to believe SBF was dishonest and sketchy. Some of them spoke up about it and others didn't. The accounts say that these concerns were shared with more central leaders in EA, who didn't take much action based on that information (e.g. they could have stopped promoting Sam as a shining example of an EA after learning of reports that he was dishonest, even if they continued to accept funding from him). [1]

    If this story is true (don't know for sure yet), then that would likely point to community failures in the sense that EA had a fairly centralized network of community/funding that was vulnerable, and it failed to distance itself from a known or suspected bad actor. This is pretty close to the OP's point about the EA community being high-trust and so far not developing sufficient mechanisms to verify that trust as it has scaled.

--

[1]: I do want to clarify that, in addition to this story still being unconfirmed, I'm mostly not trying to place a ton of blame or hostility on EA leaders who may have made mistakes. Leadership is hard, the situation sounds hard, and I think EA leaders have done a lot of good things outside of this situation. What we find out may change how much responsibility I think the EA movement should place on those people, but overall I'm much more interested in looking at systemic problems/solutions than fixating on the blame of individuals.

Can you say a bit more about what you think EA has lost that makes it valuable?

Thanks for clarifying. That helps me understand your concern about the unilateralist's curse with funders acting independently. But I don't understand why the OP's proposal of evaluating/encouraging funding diversification for important cause areas would exacerbate it. Presumably those funders could make risky bets regardless of this evaluation. Is it because you think it would bring a lot more funders into these areas, or give them more permission to fund projects that they are currently ignoring?

Was it this post, by chance? https://forum.effectivealtruism.org/posts/AbohvyvtF6P7cXBgy/brainstorming-ways-to-make-ea-safer-and-more-inclusive It seems to be on a very similar topic. It has a different name, so it's probably not the same one, though possibly Richard revised the title at some point.

Thanks for explaining, but who are you considering to be the "regulator" who is "captured" in this story? I guess you are thinking of either OpenPhil or OpenAI's board as the "regulator" of OpenAI. I've always heard the term "regulatory capture" in the context of companies capturing government regulators, but I guess it makes sense that it could be applied to other kinds of overseers of a company, such as its board or funder.

I've also been very upset since the FTX scandal began, and I love this community too. I think you're right that EA will lose some people. But I am not so worried the community will collapse (although it's possible that ending the global EA community could be a good thing). People's memories are short, and all things pass. In one year, I would be willing to bet there will still be lots of (and still not enough!) good people working on and donating to important, tractable, and neglected causes. There will still be an EA Forum with lively debates happening, and arguments about FTX will by that point make up a small fraction of the content. There will still be new people discovering EA and getting inspired by the potential to increase their positive impact in the world.

To be sure, I do think we should be worried* about the future of EA right now. But more in the sense of worried about whether EA can remain true to its core values and ideals going forward than about whether it can survive in some form.

--

*Note that when I say "we should be worried", I actually mean "we should be putting careful attention toward" rather than "we should be consumed by anxiety about". Be kind to yourself, and if you're feeling more of the latter, now may be a good time to double down on self-care.

Can you clarify which "public hearings" were demanded? I'm not sure if you're talking about how quickly the bankruptcy process has been moving at FTX, or about the reactions from people on the EA Forum since the news about FTX broke.
