Sarah Levin

363 karma · Joined Nov 2022

Participation

    Comments: 21

    I mostly agree with your larger point here, especially about the relative importance of FTX, but early Leverage was far more rationalist than it was EA. As of 2013, Leverage staff was >50% Sequences-quoting rationalists, including multiple ex-SIAI and one ex-MetaMed, compared with exactly one person (Mark, who cofounded THINK) who was arguably more of an EA than a rationalist. Leverage staff taught at CFAR workshops before Leverage held the first EA Summit. Circa 2013, Leverage donors overlapped strongly with SIAI/MIRI donors but not with CEA donors. And so on.

    I think trying to figure out the common thread "explaining datapoints like FTX, Leverage Research, [and] the LaSota crew" won't yield much of value, because those three things aren't especially similar to each other, either in their internal workings or in their external effects. "Committing world-scale financial crime," "causing a nervous breakdown in your employee," and "stabbing your landlord with a sword" aren't similar to one another, and I don't see why you'd expect to find a common cause. "All happy families are alike; each unhappy family is unhappy in its own way."

    There's a separate question of why EAs and rationalists tolerate weirdos, which is more fruitful. But an answer there will also have to explain why they welcome controversial figures like Peter Singer or Eliezer Yudkowsky, and why extremely ideological group houses like early Toby Ord's [EDIT: Nope, false] or, more recently, the Karnofsky/Amodei household exercise such strong intellectual influence in ways that mainstream society wouldn't accept. And frankly, if you took away the tolerance for weirdos, there wouldn't be much left of either movement.

    Your “90% confidence interval” of… what, exactly? This looks like a confidence interval over the value of your own subjective probability estimate? And “90% as the mean” of… a bunch of different guesses you’ve taken at your “true” subjective probability? I can’t imagine why anyone would do that, but I can’t think what else this could coherently mean…?

    If I can be blunt, I suspect you might be repeating probabilistic terms without really tracking their technical meaning, as though you’re just inserting nontechnical hedges. Maybe it’s worth taking the time to reread the map/territory stuff and then run through some calibration practice problems while thinking closely about what you’re doing. Or maybe just use nontechnical hedges more; they work perfectly well for expressing things like this.
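
    To make that concrete, here's a minimal sketch of what a bare "90% probability" operationally commits you to, in Python (the predictions list is made up purely for illustration):

    ```python
    # What "90% probability" means in calibration terms: of all the
    # claims you tag with 90%, about 90% should turn out true.
    # Hypothetical data: (stated probability, did the event happen?)
    predictions = [
        (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
        (0.9, True), (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    ]

    outcomes = [happened for p, happened in predictions if p == 0.9]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"stated 90%, observed {hit_rate:.0%}")  # close => well calibrated
    ```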

    ...What on earth does "90% probability, with medium confidence" mean? Do you think it's 90% likely or not?

    Great, this is useful data.

    Results demonstrated that FTX had decreased satisfaction by 0.5-1 points on a 10-point scale within the EA community, but overall community sentiment remained positive at ~7.5/10

    That's a big drop! In practice I've only ever seen this type of satisfaction scale give results between about 7/10 and 9.5/10 (which makes sense: if my satisfaction with EA is 3/10, I'm probably not sticking around the community and answering member surveys), so that decline covers a big chunk of the scale's de facto range.
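
    A quick back-of-the-envelope check, assuming the de facto range really does run from about 7 to 9.5 (those bounds are my impression, not survey data):

    ```python
    # Share of the scale's effective range covered by a 0.5-1 point drop,
    # assuming responses in practice span roughly 7/10 to 9.5/10.
    de_facto_range = 9.5 - 7.0  # 2.5 points

    for drop in (0.5, 1.0):
        print(f"a {drop}-point drop is {drop / de_facto_range:.0%} of the range")
    # a 0.5-point drop is 20% of the range
    # a 1.0-point drop is 40% of the range
    ```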

    I suppose it's not surprising that the impact on perception is much bigger inside EA, where there's (appropriately) been tons of discourse on this, than in the general public.

    Looking back five months later, can you say anything about whether this program ended up matching people with new jobs or opportunities, and if so how many? Thanks!

    Looking back five months later, can you say anything about whether this program ended up making grants, and if so how much/how many? Thanks!

    within the community we're working towards the same goals: you're not trying to win a fight, you're trying to help us all get closer to the truth.


    This is an aside, but it’s an important one:

    Sometimes we're fighting! Very often it’s a fight over methods between people who share goals, e.g. fights about whether or not to emphasize unobjectionable global health interventions and downplay the weird stuff in official communication. Occasionally it’s a good-faith fight between people with explicit value differences, e.g. fights about whether to serve meat at EA conferences. Sometimes it’s a boring old struggle for power, e.g. SBF’s response to the EAs who attempted to oust him from Alameda in ~2018.

    Personally I think that some amount of fighting is critical for any healthy community. Maybe you disagree. Maybe you wish EA didn’t have any fighting. But treating that wish as a description of how things are, rather than an aspiration, is clearly incorrect.

    As many have noted, this recommendation will usually yield good results when the org responds cooperatively and bad results when the org responds defensively. It is an org’s responsibility to demonstrate that it will respond cooperatively, not a critic’s responsibility to assume that it will. And defensive responses aren’t exactly rare.

    To be more concrete: I personally would write to GiveWell before posting a critique of their work, because they have responded to past critiques with deep technical engagement, blog posts celebrating the critics, large cash prizes, etc. I would not write to CEA before posting a critique of their work, because they have responded to exactly this situation by breaking a confidentiality request in order to better prepare an adversarial public response to the critic's upcoming post. People who aren’t familiar with deep EA lore won’t know all this and shouldn’t be expected to take a leap of faith.

    This does mean that posts with half-cocked accusations will get more attention than they deserve. This is certainly a problem! My own preferred solution to this would be to stop trusting unverifiable accusations from burner accounts. Any solution will face tradeoffs.

    (For someone in OP’s situation, where he has extensive, long-standing knowledge of many key EA figures, and is further protected from most retaliation because he’s married to Julia Wise, a very influential community leader, I do indeed think that running critical posts by EA orgs will often be the right decision.)
