Ben_West

Interim Managing Director @ CEA
12069 karma · Joined Sep 2014 · Working (15+ years) · Panama City, Panama
🤷♂🤷♂🤷♂.ws

Bio

Non-EA interests include chess and TikTok (@benthamite). We are probably hiring: https://www.centreforeffectivealtruism.org/careers

How others can help me

Feedback always appreciated; feel free to email/DM me or use this link if you prefer to be anonymous.

Sequences
3

AI Pause Debate Week
EA Hiring
EA Retention

Comments
913

Oops, that was supposed to link to this sequence; updated now. (That sequence isn't a complete list of everything that I and others at CEA have done, but it's the best I know of.)

I used the default sort ("Top").

(No opinion on which is more useful; I don't use Twitter much.)

I actually did that earlier, but then realized I should first clarify what you were trying to claim. I'll copy the results below, but even though they support the view that FTX was not a huge deal, I want to flag that this methodology doesn't seem like it actually gets at the important thing.

But anyway, my original comment text:

As a convenience sample, I searched Twitter for "effective altruism". The first reference to FTX doesn't come until tweet 36, which is a link to this. Honestly, it reads mostly like a standard anti-utilitarianism complaint; FTX doesn't feel like the actual crux.

In contrast, I see 3 e/acc-type criticisms before that, two "I like EA but this AI stuff is too weird" complaints (including one retweeted by Yann LeCun??), two "EA is tech-bro/not diverse" complaints, and one thing about Wytham Abbey.

And this (survey discussed/criticized here):

This is a good point – I've (anecdotally) seen one organization "go off the rails" because of a staff member who was behaving unethically, but the CEO didn't feel they had a mandate to simply fire them without going through a lot of formal process.

I guess it's by definition hard to describe precisely when one should deviate from a standard process; perhaps "get feedback from a bunch of experts" is the best advice you could give a CEO in such a situation.

Ah yeah sorry, the claim of the post you criticized was not that FTX isn't mentioned in the press, but rather that those mentions don't seem to actually have impacted sentiment very much.

I thought when you said "FTX is heavily influencing their opinion" you were referring to changes in sentiment, but possibly I misunderstood you – if you just mean "journalists mention it a lot" then I agree.

My experience is that there are a bunch of metrics about startups which correlate with the founders' skill/effort better (though not perfectly) than exit value:

  1. Peak revenue run rate (and related metrics like EBITDA)
  2. Prestige of investors
  3. Prestige of incubator
  4. Amount of money raised
  5. Number of employees

And most of these metrics are publicly available.

I actually don't know many people in the category of "founded something that was ex-ante plausible, put multiple years into it, but it didn't work out", so I'm mostly speculating. But my somewhat limited experience is that people will usually put things like "founded and grew my startup to $10M ARR with 30 employees, backed by Sequoia" on their resume, and this is impressive despite them not exiting successfully.[1]

  1. ^

    Though obviously ~100% of these founders would happily exchange that line on their resume for a fat check from having sold their company.

Ah yeah, certainly proving yourself in some way will make it easier for you to get funding.

Dumb question: have you considered immigrating to the US? The US has substantially more VC funding available than any other country.

we are seeing really very clear evidence that when people start getting informed, FTX is heavily influencing their opinion.

Thanks! Could you share said evidence? The data sources I cited certainly have limitations; having access to more surveys etc. would be valuable.

Thanks for the helpful comment – I had not seen John's dialogue and I think he is making a valid point.

Fair point that the lack of impact might not be due to attention span but instead to things like competing messages.

In case you missed it: Angelina Li compiled some growth metrics about EA here; they seem to indicate that FTX's collapse did not "strangle" EA (though it probably wasn't good).

Thoughts on the OpenAI Board Decisions

A couple of months ago I remarked that Sam Bankman-Fried's trial was scheduled to start in October and that people should prepare for EA to be in the headlines. It turned out that his trial did not actually generate much press for EA, but a month later EA is again making news as a result of the recent OpenAI board decisions.

A couple quick points:

  1. It is often the case that people's behavior is much more reasonable than what is presented in the media. It is also sometimes the case that the reality is even stupider than what is presented. We currently don't know what actually happened, and should hold multiple hypotheses simultaneously.[1]
  2. It's very hard to predict the outcome of media stories. Here are a few takes I've heard; we should consider that any of these could become the dominant narrative.
    1. Vinod Khosla (The Information): “OpenAI’s board members’ religion of ‘effective altruism’ and its misapplication could have set back the world’s path to the tremendous benefits of artificial intelligence”
    2. John Thornhill (Financial Times): One entrepreneur who is close to OpenAI says the board was “incredibly principled and brave” to confront Altman, even if it failed to explain its actions in public. “The board is rightly being attacked for incompetence,” the entrepreneur told me. “But if the new board is composed of normal tech people, then I doubt they’ll take safety issues seriously.”
    3. The Economist: “The chief lesson is the folly of policing technologies using corporate structures … Fortunately for humanity, there are bodies that have a much more convincing claim to represent its interests: elected governments”
  3. The previous point notwithstanding, people's attention spans are extremely short, and the median outcome of a news story is ~nothing. I've commented before that FTX's collapse had little effect on the average person’s perception of EA, and we might expect a similar thing to happen here.[2]
  4. Animal welfare has historically been unique amongst EA causes in having a dedicated lobby fighting against it. While we don't yet have a HumaneWatch for AI Safety, we should be aware that people have strong interests in how AI develops, and this means that stories about AI will be treated differently from those about, say, malaria.
  5. It can be frustrating to feel that a group you are part of is being judged by the actions of a couple of people you've never met and have no strong feelings about. The flipside, though, is that we get to celebrate the victories of people we've never met. Here are a few things posted in the last week that I thought were cool:
    1. The Against Malaria Foundation is in the middle of a nine-month bed net distribution which is expected to prevent 20 million cases of malaria, and about 40,000 deaths. (Rob Mather)
    2. The Shrimp Welfare Project signed an agreement expected to prevent 125 million shrimps per year from having their eyes cut off and from other painful farming practices. (Ula Zarosa)
    3. The Belgian Senate voted to add animal welfare to their Constitution. (Bob Jacobs)
    4. Scott Alexander’s recent post also has a nice summary of victories.

  1. ^

     A collection of prediction markets about this event can be found here.

  2. ^

     Note that the data collected here does not exclude the possibility that perception of EA changed within some subcommunities (e.g. OpenAI staff), even if the average person's opinion is unchanged.
