Evan R. Murphy

AI Alignment Researcher @ Independent/Non-profit
Working (6-15 years of experience)
568 · Vancouver, BC, Canada · Joined Oct 2021


Formerly a software engineer at Google, now I'm doing independent AI alignment research.

Because of my focus on AI alignment, I tend to post more on LessWrong and AI Alignment Forum than I do here.

I'm always happy to connect with other researchers or people interested in AI alignment and effective altruism. Feel free to send me a private message!


You're right - I wasn't very happy with my word choice calling Google the 'engine of competition' in this situation. The engine was already in place and involves the various actors working on AGI and the incentives to do so. But these recent developments with Google doubling down on AI to protect their search/ad revenue are revving up that engine.

It's somewhat surprising to me the way this is shaking out. I would expect DeepMind's and OpenAI's AGI research to be competing with one another*. But here it looks like Google is the engine of competition, motivated less by any future-focused ideas about AGI than by the fact that their core search/ad business model appears to be threatened by OpenAI's AGI research.

*And hopefully cooperating with one another too.

I think it's not quite right that low trust is costlier than high trust. Low trust is costly when things are going well. There's kind of a slow burn of additional cost.

But high trust is very costly when bad actors, corruption or mistakes arise that a low trust community would have preempted. So the cost is lumpier, cheap in the good times and expensive in the bad.

(I read fairly quickly so may have missed where you clarified this.)

If anyone consults a lawyer about this or starts the process with FTXrepay@ftx.us, it could be very useful to many of us if you followed up here and shared what your experience of the process was like.

I'm surprised you were putting such high odds on it being a mistake at this point (even before the arrest). From my understanding (all public info), FTX's terms of service agreed that they would not touch customer funds. But then FTX loaned those funds to Alameda, who made risky bets with them.

IANAL, but this seems to me like a pretty clear case of fraud by FTX. I didn't think any of those aspects of the story were really disputed, but I haven't been following the story as closely in the past week or so.

Will all the results of the survey be shared publicly on the EA Forum? I couldn't find mention of this in the couple of announcements I've seen for this survey.

It looks like at least some of the 2020 survey results were shared publicly. [1, 2, 3] But I can't find 2021 survey results. (Maybe there was no 2021 EA Survey?)

Thanks for the link and highlights!

Sam claims that he donated to Republicans: "I donated to both parties. I donated about the same amount to both parties (...) That was not generally known (...) All my Republican donations were dark (...) and the reason was not for regulatory reasons - it's just that reporters freak the fuck out if you donate to Republicans [inaudible] they're all liberal, and I didn't want to have that fight". If true, this seems to fit the notion that Sam didn't just donate to look good (i.e. he donated at least partly because of his personal altruistic beliefs).

What do you mean that this donation strategy would be from Sam's "personal altruistic beliefs"? Donating equally to both political parties has been a common strategy among major corporations for a long time. It's a way for them to push their own agenda in government. It's generally an amoral self-interested strategy, not an altruistic one.

I am a big fan of gratitude practice. I try to write a little in a gratitude journal most nights, which has helped my overall state of mind since I started doing it. I would recommend it to anybody, including people involved in EA. And I'm glad you suggested it, as a little gratitude during a crisis like this can be especially helpful.

I have some reservations about posting things I'm grateful for publicly on this forum though. Gratitude can be a bit vulnerable, and this forum has more eyes on it than usual lately. Posting to a community about why you're thankful for that community could also be misinterpreted as being obsequious or virtue signalling. I think most of the benefits of gratitude practice can be enjoyed privately or with someone you trust, but if other people felt inclined to share their gratitude here, I would probably enjoy reading it and not be judgmental. And I may change my mind later and post some of that here as well :)

I would probably be more excited about this thread if the forum had a feature to post comments anonymously. I don't see any downside to an anonymous public gratitude thread, but I'm probably too lazy to create an anonymous account just for that purpose.

Ultimately this was a failure of the EA ideas more so than the EA community. SBF used EA ideas as a justification for his actions. Very few EAs would condone his amoral stance w.r.t. business ethics, but business ethics isn't really a central part of EA ideas. Ultimately, I think the main failure was EAs failing to adequately condemn naive utilitarianism. 

So I disagree with this because:

  1. It's unclear whether it's right to attribute SBF's choices to a failure of EA ideas. Following SBF's interview with Kelsey Piper and based on other things I've been reading, I don't think we can be sure at this point whether SBF was generally more motivated by naive utilitarianism or by seeking to expand his own power and influence. And it's unclear which of those headspaces led him to the decision to defraud FTX customers.
  2. It's plausible there actually were serious ways that the EA community failed with respect to SBF. According to a couple of accounts, at least several people in the community had reason to believe SBF was dishonest and sketchy. Some of them spoke up about it and others didn't. The accounts say that these concerns were shared with more central leaders in EA, who didn't take much action based on that information (e.g. they could have stopped promoting Sam as a shining example of an EA after learning of reports that he was dishonest, even if they continued to accept funding from him). [1]

    If this story is true (don't know for sure yet), then that would likely point to community failures in the sense that EA had a fairly centralized network of community/funding that was vulnerable, and it failed to distance itself from a known or suspected bad actor. This is pretty close to the OP's point about the EA community being high-trust and so far not developing sufficient mechanisms to verify that trust as it has scaled.


[1]: I do want to clarify that, in addition to this story still not being confirmed, I'm mostly not trying to place a ton of blame or hostility on EA leaders who may have made mistakes. Leadership is hard, the situation sounds hard, and I think EA leaders have done a lot of good things outside of this situation. What we find out may reduce how much responsibility I think the EA movement should place with those people, but overall I'm much more interested in looking at systemic problems/solutions than fixating on the blame of individuals.
