I think this is too pessimistic: why did one of Biden's cabinet members ask for Christiano to fill one of the top positions at the US government's AI safety org, if the government will reliably prioritize the sort of factors you cite here to the exclusion of safety? https://www.nist.gov/news-events/news/2024/04/us-commerce-secretary-gina-raimondo-announces-expansion-us-ai-safety
I also think that whether or not the government regulates private AI has little to do with whether it militarizes AI. It's not like there is one dial with "amount of government" and it just gets ...
I trust EV more than the charity commission about many things, but whether EV behaved badly over SBF is definitely not one of them. One judgment here is incredibly liable to distortion through self-interest and ego preservation, and it's not the charity commission's. (That's not a prediction that the charity commission will in fact harshly criticize EV. I wouldn't be surprised either way on that.)
'also on not "some moral view we've never thought of".'
Oh, actually, that's right. That does change things a bit.
People don't reject this stuff, I suspect, because there is, frankly, a decently large minority of the community who think "black people have lower IQs for genetic reasons" is suppressed forbidden knowledge. Scott Alexander has done a lot, entirely deliberately in my view, to spread that view over the years (although this is probably not the only reason), and Scott is generally highly respected within EA.
Now, unlike the people who spend all their time doing race/IQ stuff, I don't think more than a tiny, insignificant fraction of the people in the com...
Materialism is an important trait in individuals, and plausibly could be an important difference between groups. (Certainly the history of the Jewish people attests to the fact that it has been considered important in groups!) But the horrific recent history of false hypotheses about innate Jewish behavior helps us see how scientifically empty and morally bankrupt such ideas really are.
Coincidentally, I recently came across an academic paper that proposed a partial explanation of the current East Asian fertility crisis (e.g., South Korea's fertility dec...
I find it easy to believe there was a heated argument but no threats, because it is easy for things to get exaggerated, and the line between telling someone you no longer trust them because of a disagreement and threatening them is unclear when you are a powerful person who might employ them. But I find Will's claim that the conversation wasn't even about whether Sam was trustworthy, or anything related to that, really quite hard to believe. It would be weird for someone to be mistaken or exaggerate about that, and I feel like a lie is unlikely, simply because I don't see what anyone would gain from lying to TIME about this.
Nathan's comment here is one case where I really want to know what the people giving agree/disagree votes intended to express. Agreement/disagreement that the behaviour "doesn't sound like Will"? Agreement/disagreement that Naia would be unlikely to be lying? General approval/disapproval of the comment?
I disagree-voted because I have the impression that there's a camp of people who left Alameda that has been misleading in their public anti-SBF statements, and has a separate track record of being untrustworthy.
So, given that background, I think it's unlikely that Will threatened someone in a strong sense of the word, and possible that Bouscal or MacAulay might be misleading, though I haven't tried to get to the bottom of it.
Yes, but not at great length.
From my memory, which definitely could be faulty since I only listened once:
He admits people did tell him Sam was untrustworthy. He says that his impression was something like "there was a big fight and I can't really tell what happened or who is right" (not a direct quote!). Stresses that many of the people who warned him about Sam continued to have large amounts of money on FTX later, so they didn't expect the scale of fraud we actually saw either. (They all seem to have told TIME that originally also.) Says Sam w...
Will's expressed public view on that sort of double or nothing gamble is hard to actually figure out, but it is clearly not as robustly anti as commonsense would require, though it is also clearly a lot LESS positive than SBF's view that you should obviously take it: https://conversationswithtyler.com/episodes/william-macaskill/
(I haven't quoted from the interview, because there is no one clear quote expressing Will's position, text search for "double" and you'll find the relevant stuff.)
Actually, I have a lot of sympathy with what you are saying here. I am ultimately somewhat inclined to endorse "in principle, the ends justify the means, just not in practice" over at least a fairly wide range of cases. I (probably) think in theory you should usually kill one innocent person to save five, even though in practice anything that looks like doing that is almost certainly a bad idea, outside artificial philosophical thought experiments and maybe some weird but not too implausible scenarios involving war or natural disaster. But at the same time...
I don't necessarily disagree with most of that, but I think it is ultimately still plausible that people who endorse a theory that obviously says in principle bad ends can justify the means are somewhat (plausibly not very much though) more likely to actually do bad things with an ends-justifies-the-means vibe. Note that this is an empirical claim about what sort of behaviour is actually more likely to co-occur with endorsing utilitarianism or consequentialism in actual human beings. So it's not refuted by "the correct understanding of consequentialism mos...
I don't necessarily disagree with any of that, but the fact that you asserted it implicates that you think it has some kind of practical relevance, which is where I might want to disagree.
I think it's fundamentally dishonest (a kind of naive instrumentalism in its own right) to try to discourage people from having true beliefs because of faint fears that these beliefs might correlate with bad behavior.
I also think it's bad for people to engage in "moral profiling" (cf. racial profiling), spreading suspicion about utilitarians in general based on very speculative...
The 3% figure for utilitarianism strikes me as a bit misleading on its own, given what else Will said. (I'm not accusing Will of intent to mislead here; he said something very precise that I, as a philosopher, entirely followed, it was just a bit complicated for laypeople.) Firstly, he said a lot of the probability space was taken up by error theory, the view that there is no true morality. So to get what Will himself endorses, whether or not there is a true morality, you have to basically subtract an unknown but large amount for his credence in error th...
My memory of the podcast (could be wrong, only listened once!) is that Will said that, conditional on error theory being false, his credence in consequentialism is about 0.5.
I think he meant conditional on error theory being false, and also on not "some moral view we've never thought of".
Here's a quote of what Will said starting at 01:31:21: "But yeah, I tried to work through my credences once and I think I ended up in like 3% in utilitarianism or something like. I mean large factions go to, you know, people often very surprised by this, but large fact...
Actually, Wikipedia is characterizing the US bills a bit misleadingly: at least the one I looked at is not a full legal ban on trans women using women's bathrooms, but seems to cover only schools specifically.
The answer to 2 is almost certainly yes: (binary) trans people in the UK are legally allowed to use a toilet that matches their gender identity, as far as I can tell. The first hit on Google for 'are trans women allowed to use women's toilets in the UK' is an official Metropolitan (i.e. London) Police doc which states:
'If someone (whether binary or non-binary) presents as, say, female then they use the female toilet and vice versa. There is no law or policy prohibiting anyone from using whichever toilet matches their gender identity, and a trans* individual...
I think part of the issue is that the Time article sort of contains a very serious allegation about Will specifically, which he hasn't (yet*) publicly given evidence against (or clearly denied), namely Naia Bouscal's allegation that he threatened Tara Mac Aulay. I say "sort of", because saying someone "basically" X-ed, where X-ing is bad, arguably carries the implication that it maybe wasn't really X exactly, but something a bit like it. Which makes it a bit hard to tell exactly what Naia was actually accusing Will of. Alas, as far as I know, she's not said...
But I do take your point that Will was not the only person involved, or necessarily the most important internally.
I must say that, given that I know from prior discussion on here that you are not Will's biggest fan, your attempt to be fair here is quite admirable. There should maybe be an "integrity" react button?
I think that even if you buy that, Will's behavior is still alarming, just in a different way. Why exactly should we, as a community, think of ourselves as being fitted to steer public opinion? Weren't we just meant to be experts on charity, rather than everything under the sun? (Not to mention that Musk is not the person I would choose to collaborate with on that, but that's for another day.) Will complains about Sam's hubris, but what could be more hubristic than that?
I remember feeling nervous when I first started working in EA that (otherwi...
Yeah, I think just buying Twitter to steer the narrative seems quite bad. But like, I have spent a large fraction of my career trying to think of mechanism design for discussion and social media platforms and so my relation to Twitter is I think a pretty healthy "I think I see lots of ways in which you could make this platform much more sanity-promoting" in a way that isn't about just spreading my memes and ideologies.
Will has somewhat less of that background, and I think would have less justified confidence in his ability to actually make the platform better from a general sanity perspective, though still seems pretty plausible to me he saw or sees genuine ways to make the platform better for humanity.
I strongly suspect Will is trying to avoid being sued.
Even from a purely selfish point of view, explicitly apologising and saying "sorry, I made a mistake in not trusting the people who told me Sam behaved badly at Alameda" would actually help restore his reputation a bit.
EDIT: Actually the bit about Beckstead here is wrong, I'd misremembered the exact nature of the suit. See Jason's comment below. But Nick Beckstead has already been sued (unsuccessfully) on the grounds that he in some sense endorsed SBF whilst he "should have known" that he was...
Can you link to a discussion of the suit in question? I don't think that would be an accurate characterization of the suit I am aware of (complaint analyzed here). That suit is roughly about aiding SBF in making specific transfers in breach of his fiduciary duty to Alameda. I wouldn't say it is about "endors[ing]" in the ordinary meaning of the word, or that it relied on allegations that Nick should have known about the criminal frauds going on at FTX.
That being said, I do agree more generally that "people who had a role at FTXFF" tend to be at the t...
If it is the case that MacAskill cannot be forthcoming for valid reasons (it would open him up to legal vulnerability), then as a community it would still make sense to err on the side of caution and have other leaders for this community, as Chris argues.
I meant something in between "is" and "has a non-zero chance of being", like assigning significant probability to it (obviously I didn't have an exact number in mind), and not just for base rate reasons about believing all rich people to be dodgy.
I feel like "people who worked with Sam told people about specific instances of quite serious dishonesty they had personally observed" is being classed as "rumour" here, which whilst not strictly inaccurate, is misleading, because it is a very atypical case relative to the image the word "rumour" conjures. Also, even if people only did receive stuff that was more centrally rumour, I feel like we still want to know if anyone in leadership argued "oh, yeah, Sam might well be dodgy, but the expected value of publicly backing him is high because of the upside...
I feel like "people who worked with Sam told people about specific instances of quite serious dishonesty they had personally observed" is being classed as "rumour" here, which whilst not strictly inaccurate, is misleading, because it is a very atypical case relative to the image the word "rumour" conjures.
I agree with this.
...[...] I feel like we still want to know if any one in leadership argued "oh, yeah, Sam might well be dodgy, but the expected value of publicly backing him is high because of the upside". That's a signal someone is a bad leader in my view
https://twitter.com/RichardHanania/status/1657541010745081857?lang=en. There you go for the quote in the form Wikipedia gives it.
Please people, do not treat Richard Hanania as some sort of worthy figure who is a friend of EA. He was a Nazi, and whilst he claims he has moderated his views, he is still very racist as far as I can tell.
Hanania called for trying to get rid of all non-white immigrants in the US and the sterilization of everyone with an IQ under 90, indulged in antisemitic attacks on the allegedly Jewish elite, and even after his supposed reform was writing about the need for the state to harass and imprison Black people specifically ('a revolution in our culture or form of governmen...
I'd just like to clarify that my blogroll should not be taken as a list of "worthy figure[s] who [are] friend[s] of EA"! They're just blogs I find often interesting and worth reading. No broader moral endorsement implied!
fwiw, I found TracingWoodgrains' thoughts here fairly compelling.
ETA, specifically:
...I have little patience with polite society, its inconsistencies in which views are and are not acceptable, and its games of tug-of-war with the Overton Window. My own standards are strict and idiosyncratic. If I held everyone to them, I'd live in a lon
Your comment seems a bit light on citations, and didn't match my impression of Hanania after spending 10s of hours reading his stuff. I've certainly never seen him advocate for an authoritarian government as a means of enforcing a "natural" racial hierarchy. This claim stood out to me:
Hanania called for trying to get rid of all non-white immigrants in the US
Hanania wrote this post in 2023. It's the first hit on his substack search for "immigration". This apparent lack of fact-checking makes me doubt the veracity of your other claims.
It seems like ...
I don't think it makes any sense to punish people for past political or moral views they have sincerely recanted. There is some sense in which it shows bad judgement, but ideology is a different domain from most. I am honestly quite invested in something like 'moral progress'. It's a bit of a naive position to have to defend philosophically, but I think most altruists are too. At least if they are being honest with themselves. Lots of people are empirically quite racist. Very few people grew up with what I would consider to be great values. If someone sincere...
I'd like to give some context for why I disagree.
Yes, Richard Hanania is pretty racist. His views have historically been quite repugnant, and he's admitted that "I truly sucked back then". However, I think EA causes are more important than political differences. It's valuable when Hanania exposes the moral atrocity of factory farming and defends EA to his right-wing audience. If we're being scope-sensitive, I think we have a lot more in common with Hanania on the most important questions than we do on political issues.
I also think Hanania has excellent tak...
I think it's pretty unreasonable to call him a Nazi--he'd hate Nazis, because he loves Jews and generally dislikes dumb conservatives.
I agree that he seems pretty racist.
I have very mixed views on Richard Hanania.
On one hand, some of his past views were pretty terrible (even though I believe that you've exaggerated the extent of these views).
On the other hand, he is also one of the best critics of conservatives. Take, for example, this article where he tells conservatives to stop being idiots who believe random conspiracy theories, and another where he tells them to stop scamming everyone. These are amazing, brilliant articles with great chutzpah. As someone quite far to the right, he's able to make these points far more cr...
Given his past behavior, I think it's more likely than not that you're right about him. Even someone more skeptical should acknowledge that the views he expressed in the past and the views he now expresses likely stem from the same malevolent attitudes.
But about far-left politics being 'not racist', I think it's fair to say that far-left politics discriminates in favor of or against individuals on the basis of race. It's usually not the kind of malevolent racial discrimination of the far-right - which absolutely needs to be condemned and eliminated by society...
When someone makes the accusation that transhumanism or effective altruism or longtermism or worries about low birth rates is a form of thinly veiled covert racism, I generally think they don’t really understand the topic and are tilting at windmills.
But then I see people who are indeed super racist talking about these topics and I can’t really say the critics are fully wrong. Particularly if communities like the EA Forum or the broader online EA community don’t vigorously repudiate the racism.
https://twitter.com/letsgomathias/status/1687543615692636160 (Just so people can get a sense of how very bad his views at least were, and could still be.)
I think she provided excellent evidence that at least some of your sources are in fact accurately characterized as "Nazi". Did you actually read the article she linked?
This is a meta-level point, but I'd be very, very wary of giving any help to Hanania if he attempts (even sincerely) to position himself publicly as a friend of EA. He was outed a while ago as having moved in genuinely and unambiguously white supremacist political circles for years. And while I accept that repentance is possible, and he claims to have changed (and probably has become less bad), I do not trust someone at all who had to have this be exposed rather than publicly owning up and denouncing his (allegedly) past views of his own accord, especially...
'Naive consequentialist plans also seem to have increased since FTX, mostly as a result of shorter AI timelines and much more involvement of EA in the policy space.'
This gives me the same feeling as Rebecca's original post: that you have specific information about very bad stuff that you are (for good or bad reasons) not sharing.
Yes, Harris should have asked Will about this: https://time.com/6262810/sam-bankman-fried-effective-altruism-alameda-ftx/
Wasn't the OpenAI thing basically the opposite of the mistake with FTX though? With FTX, people ignored what appears to have been a fair amount of evidence that a powerful, allegedly ethical businessperson was in fact shady. At OpenAI, people seem to have got what they perceived as evidence (and we've no strong evidence they were wrong) that a powerful, allegedly ethically motivated businessperson was in fact shady, so they learnt the lessons of FTX and tried to do something about it (and failed).
'Am I correct in interpreting your comment as something like "Rebecca says it's costly to say more which might imply she is sitting on not yet disclosed information that might put powerful EAs in a bad light"?'
Yes, that's what I meant. Maybe not "not already disclosed" though. It might just be confirmation that the portrait painted here is indeed fair and accurate: https://time.com/6262810/sam-bankman-fried-effective-altruism-alameda-ftx/ EDIT: I don't doubt that the article is broadly literally accurate, but there's always a big gap between what...
As an aside, this isn't really action-relevant, but insofar as being involved with the legal system is a massive punishment even when the legal system itself is very likely going to eventually come to the conclusion you've done nothing legally wrong, that seems bad? Here it also seems to be having a knock-on effect of making it harder to find out what actually happened, rather than being painful but producing useful information.
The suit against Brady also sounds like a complete waste of society's time and money to me.
Who would be able to sue? Would it really be possible for FTX customers/investors to sue someone for not making public "I heard Sam lies a lot and once misplaced money at Alameda early on and didn't seem too concerned, and reneged on a verbal agreement to share ownership"? Just because someone worked at the Future Fund? Or even someone who worked at EV?
I'd note that Nick Beckstead was in active litigation with the Alameda bankruptcy estate until that was dismissed last month (Docket No. 93). I think it would be very reasonable for anyone who worked at FTXFF to be concerned about their personal legal exposure here. (I am not opining as to whether exposure exists, only that I would find it extremely hard to fault anyone who worked at FTXFF for believing that they were at risk. After all, Nick already got sued!)
It's harder to assess exposure for other groups of people. To your question, there may be a diffe...
The complaints here seem to be partly about HOW EtG is promoted, rather than how MUCH. Though I am mildly skeptical that people in fact did not warn against doing harm to make money while promoting EtG, and much more skeptical that SBF would have listened if they had done this more.
True. We should make sure any particular safeguard wasn't in place around how people advocated for it before assuming it would have helped, though. For what it's worth, my sense is that a much more culpable thing was not blowing the whistle on Sam's bad behaviour at early Alameda even after Will and other leaders (I forget exactly who, if it's even known) were informed about it. That mistake was almost certainly far less consequential for the people harmed by FTX (I don't think it would have stopped the fraud; it might have protected EA itself), but I strongly suspect it was more knowably wrong at the time than anything anyone did or said about EtG as a general idea.
I think there are two separate but somewhat intertwined chains of inquiry under discussion here:
Also, I don't know if Spencer Greenberg's podcast with Will is recorded yet, but if it hasn't been I think he absolutely should ask Will what he thinks the phrase about "extensive and significant mistakes" here actually refers to. EDIT: Having listened (vaguely, while working) to most of the Sam Harris interview with Will, as far as I can tell Harris entirely failed to ask anything about this, which is a huge omission. Another question Spencer could ask Will is: did you specify this topic was off-limits to Harris?
I felt the Sam Harris interview was disappointingly soft and superficial. To be fair to MacAskill, Harris did an unusually bad job of pushing back and taking a harder line, and so MacAskill wasn't forced to get deeper into it.
And basically nothing about how to avoid a similar situation happening again? Except for a few lines about decentralisation. Quite uninspiring.
I mostly agree with this, and upvoted strongly, but I don't think the scare quotes around "criticism" are warranted. Improving ideas and projects through constructive criticism is not the same thing as speaking truth to power, but it is still good and useful, it's just a different good and useful thing.
Also, I feel mean for pressing the point against someone who is clearly finding this stressful and is no more responsible for it than anyone else in the know, but I really want someone to properly explain what the warning signs the leadership saw were, who saw them, and what was said internally in response to them. I don't even know how much that will help with anything, to be honest, so much as I just want to know. But at least in theory, anyone who behaved really badly should be removed from positions of power. (And I do mean just that: positions where t...
'and think confusion on this issue has indirectly resulted in a lot of harm.'
Can you say a bit more about this?
I don't actually find either all THAT reassuring. The GW blogpost just says most nets are used for their intended purpose, but 30% being used otherwise is still a lot, not to mention they can be used for their intended purpose and then later for fishing. The Cold Takes blog post just cites the same data about most nets being used for their intended purpose.
I don't think it's necessary, no. But I do think some early critics of EtG were motivated at least partly by a general anticapitalist case that business or at least finance careers were generically morally problematic in themselves.
Any claim that advising people to earn to give is inherently really bad needs to either defend the view that "start a business or take another high paying job" is inherently immoral advice, or explain why it becomes immoral when you add "and give the money to charity" or when it's aimed at EAs specifically. It's possible that can be done, but I think it's quite a high bar. (Which is not to say EtG advice couldn't be improved in ways that make future scandals less likely.)
You're right! It's not that ETG is inherently bad (and frankly, I haven't seen anyone make this argument), it's that specific EV-maximising interpretations of ETG cause people to pursue careers that are (1) harmful, (2) net harmful, or (3) too risky to pay off.
Personally, I think FTX was (1) and (3), and (though I initially thought it unlikely) probably also (2). I'm not really sure where the bar is, but under any moderately deontological framework (1) is especially concerning, and many of the people EA might want to have a good reputation with believe (1). So that's roughly the worldview-neutral case for caring about strongly rejecting EV-maximising forms of ETG.
I should probably stop posting on this or reading the comments, for the sake of my mental health (I mean that literally, this is a major anxiety disorder trigger for me.) But I guess I sort of have to respond to a direct request for sources.
Scott's official position on this is agnosticism, rather than public endorsement*. (See here for official agnosticism: https://www.astralcodexten.com/p/book-review-the-cult-of-smart)
However, for years at SSC he put the dreaded neo-reactionaries on his blogroll. And they are definitely race/IQ guys. Meanwhile...