anonymous_ea

1214 · Joined Sep 2018

It makes a lot of difference to me that Charles' behavior was consistently getting better. If someone consistently flouts norms without any improvement, at some point they should be indefinitely banned. This is not the case with Charles. He started off with really high variance and at this point has reached a pretty tolerable amount. He has clearly worked on his actions. The comments he posted while flouting the mods' authority generally contributed to the conversation. There are other people who have done worse things without action from the mod team. Giving him a 10 year ban without appeal for this feels less like a principled decision and more like another instance of the mod team asserting its authority and declining to deal with the messiness someone is causing. 

Various comments made by this user in multiple posts some time ago, some of which received warnings by mods but nothing beyond that. 

I find this reflects worse on the mod team than Charles. This is nowhere near the first time I've felt this way. 

Fundamentally, it seems the mod team heavily prioritizes civility and following shallow norms above enabling important discourse. The post on forum norms says a picture of geese all flying in formation and in one direction is the desirable state of the forum; I disagree that this is desirable. Healthy conflict is necessary to sustain a healthy community. Conflict sometimes entails rudeness. Some rudeness here and there is not a big deal and does not need to be stamped out entirely. This also applies to the people who get banned for criticizing EA rudely, even when they're criticizing EA for its role in one of the great frauds of modern history. Banning EA critics for minor reasons is a short-sighted move at best. 

Banning Charles for 10 years (!!) for the relatively small crime of evading a previous ban is a seriously flawed idea. Some of his past actions like doxxing someone (without any malice I believe) are problematic and need to be addressed, but do not deserve a 10 year ban. Some of his past comments, especially farther in the past, have been frustrating and net-negative to me, but these negative actions are not unrelated to some of his positive traits, like his willingness to step out of EA norms and communicate clearly rather than like an EA bot. The variance of his comments has steadily decreased over time. Some of his comments are even moderator-like, such as when he warned EA forum users not to downvote a WSJ journalist who wasn't breaking any rules. I note that the mod team did not step in there to encourage forum norms. 

I also find it very troubling that the mod team has consistent and strong biases in how it enforces its norms and rules, such as not taking any meaningful action against an EA in-group member for repeated and harmful violations of norms while banning an EA critic for 10 years for what were probably relatively minor and harmless violations. I don't believe Charles would have received a similar ban if he were an employee of a brand name EA org or were in the right social circles. 

Finally, as Charles notes, there should be an appeals process for bans. 

Sorry, I'm not sure I understand what your point is. Are you saying that my point 1 is misleading because having even any relevant experience can be a big boost for an applicant's chances of getting hired by CEA, and any relevant experience isn't a high bar? 

It sounds like there are two separate things going on:

  1. Jobs at CEA are very hard to get, even for candidates with impressive resumes overall.
  2. CEA finds it hard to get applicants who have particular desirable qualities like previous experience in the same role. 

(Of course, this is bound to be a judgment call; e.g. Eliezer didn’t state how many 9’s of confidence he has. It’s not like there’s a universal convention for how many 9’s are enough 9’s to state something as a fact without hedging, or how many 9’s are enough 9’s to mock the people who disagree with you.)

Yes, agreed. 

Let me lay out my thinking in more detail. I mean this as an explanation of my views, not as an attempt to persuade. 

Paul's account of Aaronson's view says that Eliezer shouldn't be as confident in MWI as he is, which in words sounds exactly like my point, and similar to Aaronson's stack exchange answer. But it still leaves open the question of how overconfident he was, and what, if anything, should be taken away from this. It's possible that there's a version of my point which is true but is also uninteresting or trivial (who cares if Yudkowsky was 10% too confident about MWI 15 years ago?). 

And it's worth reiterating that a lot of people give Eliezer credit for his writing on QM, including for being forceful in his views. I have no desire to argue against this. I had hoped to sidestep discussing this entirely since I consider it to be a separate point, but perhaps this was unfair and led to miscommunication. If someone wants to write a detailed comment/post explaining why Yudkowsky deserves a lot of credit for his QM writing, including credit for how forceful he was at times, I would be happy to read it and would likely upvote/strong upvote it depending on quality. 

However, here my intention was to focus on the overconfidence aspect. 

I'll explain what I see as the epistemic mistakes Eliezer likely made to end up in an overconfident state. Why do I think Eliezer was overconfident on MWI? 

(Some of the following may be wrong.)  

  • He didn't understand non-MWI-extremist views, which should have rationally limited his confidence
    • I don't have sources for this, but I think something like this is true.
    • This was an avoidable mistake
    • Worth noting that Eliezer has updated towards the competence of elites in science since some of his early writing, according to Rob's comment elsewhere in this thread
  • It's possible that his technical understanding was uneven. This should also have limited his confidence.
    • Aaronson praised him for "actually get[ting] most of the technical stuff right", which of course implies that not everything technical was correct.
    • He also suggested a specific, technical flaw in Yudkowsky's understanding.
    • One big problem with having extreme conclusions based on uneven technical understanding is that you don't know what you don't know. And in fact Aaronson suggests a mistake Yudkowsky seems unaware of as a reason why Yudkowsky's central argument is overstated/why Yudkowsky is overconfident about MWI.
    • However, it's unclear how true/important a point this really is
  • At least 4 points limit confidence in P(MWI) to some degree:
    • Lack of experimental evidence
    • The possibility of QM getting overturned
    • The possibility of a new and better interpretation in the future
    • Unknown unknowns
    • I believe most or all of these are valid, commonly brought up points that together limit how confident anyone can be in P(MWI). Reasonable people may disagree with their weighting of course.
    • I am skeptical that Eliezer correctly accounted for these factors

Note that these are all points about the epistemic position Eliezer was in, not about the correctness of MWI. The first two are particular to him, and the last one applies to everyone. 

Now, Rob points out that maybe the heliocentrism example is lacking context in some way (if it isn't, I find it a very compelling example of a wildly overconfident mistake). Personally I think there are at least a couple[1] [2] of places in the sequences where Yudkowsky clearly says something that I think indicates ridiculous overconfidence tied to epistemic mistakes, but to be honest I'm not excited to argue about whether some of his language 15 years ago was or wasn't overzealous. 

The reason I brought this up despite it being a pretty minor point is because I think it's part of a general pattern of Eliezer being overconfident in his views and overstating them. I am curious how much people actually disagree with this. 

Of course, whether Eliezer has a tendency to be overconfident and overstate his views is only one small data point among very many others in evaluating p(doom), the value of listening to Eliezer's views, etc. 

  1. ^

    "Many-worlds is an obvious fact, if you have all your marbles lined up correctly (understand very basic quantum physics, know the formal probability theory of Occam’s Razor, understand Special Relativity, etc.)"

  2. ^

    "The only question now is how long it will take for the people of this world to update." Both quotes from https://www.lesswrong.com/s/Kqs6GR7F5xziuSyGZ/p/S8ysHqeRGuySPttrS

I’m trying to make sense of why you’re bringing up “overconfidence” here. The only thing I can think of is that you think that maybe there is simply not enough information to figure out whether MWI is right or wrong (not even for an ideal reasoner with a brain the size of Jupiter and a billion years to ponder the topic), and therefore saying “MWI is unambiguously correct” is “overconfident”?

Here's my point: There is a rational limit to the amount of confidence one can have in MWI (or any belief). I don't know where exactly this limit is for MWI-extremism but Yudkowsky clearly exceeded it sometimes. To use made up numbers, suppose: 

  • MWI is objectively correct
  • Eliezer says P(MWI is correct) = 0.9999999
  • But rationally one can only reach P(MWI) = 0.999
    • Because there are remaining uncertainties that cannot be eliminated through superior thinking and careful consideration, such as the lack of experimental evidence, the possibility of QM getting overturned, the possibility of a new and better interpretation in the future, and unknown unknowns.
    • These factors add up to at least P(Not MWI) = 0.001.

Then even though Eliezer is correct about MWI being correct, he is still significantly overconfident in his belief about it. 
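To make the made-up numbers above concrete, here is a small sketch in log-odds terms (the probabilities are the hypothetical figures from this example, not real estimates of anything). Moving from 0.999 to 0.9999999 is not a small refinement: it amounts to implicitly claiming roughly nine extra nats, or about thirteen extra bits, of evidence.

```python
import math

# Hypothetical numbers from the example above: the stated confidence
# vs. the assumed rational ceiling.
p_stated = 0.9999999
p_ceiling = 0.999

def log_odds(p):
    """Log-odds of a probability; each nat is a factor of e in the odds."""
    return math.log(p / (1 - p))

# Extra evidence implicitly claimed by the higher confidence level.
gap_nats = log_odds(p_stated) - log_odds(p_ceiling)
gap_bits = gap_nats / math.log(2)

print(round(gap_nats, 2), round(gap_bits, 1))  # prints: 9.21 13.3
```

In this framing, overshooting the ceiling isn't a rounding error in the third decimal place; it is a claim to a large amount of evidence that, by assumption, no one has.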

Consider Paul's example of Eliezer saying MWI is comparable to heliocentrism:

If we are deeply wrong about physics, then I [Paul Christiano] think this could go either way. And it still seems quite plausible that we are deeply wrong about physics in one way or another (even if not in any particular way). So I think it's wrong to compare many-worlds to heliocentrism (as Eliezer has done). Heliocentrism is extraordinarily likely even if we are completely wrong about physics---direct observation of the solar system really is a much stronger form of evidence than a priori reasoning about the existence of other worlds. 

I agree with Paul here. Heliocentrism is vastly more likely than any particular interpretation of quantum mechanics, and Eliezer was wrong to have made this comparison. 

This may sound like I'm nitpicking, but I think it fits into a pattern of Eliezer making dramatic and overconfident pronouncements, and it's relevant information for people to consider e.g. when evaluating Eliezer's belief that p(doom) = ~1 and the AI safety situation is so hopeless that the only thing left is to die with slightly more dignity. 

Of course, it's far from the only relevant data point. 

Regarding (2), I think we're on the same page haha. 

When I said it was relevant to his track record as a public intellectual, I was referring to his tendency to make dramatic and overconfident pronouncements (which Ben mentioned in the parent comment). I wasn't intending to imply that the debate around QM had been settled or that new information had come out. I do think that even at the time Eliezer's positions on both MWI and why people disagreed with him on it were overconfident though. 

I think you're right that my comment gave too little credit to Eliezer, and possibly misleadingly implied that Eliezer is the only one who holds some kind of extreme MWI or anti-collapse view or that such views are not or cannot be reasonable (especially anti-collapse). I said that MWI is a leading candidate but that's still probably underselling how many super pro-MWI positions there are. I expanded on this in another comment.  

Your story of Eliezer comparing MWI to heliocentrism is a central example of what I'm talking about. It is not that his underlying position is wrong or even unlikely, but that he is significantly overconfident. 

I think this is relevant information for people trying to understand Eliezer's recent writings. 

To be clear, I don't think it's a particularly important example, and there is a lot of other more important information than whether Eliezer overestimated the case for MWI to some degree while also displaying impressive understanding of physics and possibly/probably being right about MWI. 

I agree that: Yudkowsky has an impressive understanding of physics for a layman, in some situations his understanding is on par with or exceeds some experts', and he has written explanations of technical topics that even some experts like and find impressive. This includes not just you, but also e.g. Scott Aaronson, who praised his series on QM in the same answer I excerpted above, calling it entertaining and enjoyable and saying it gets the technical stuff mostly right. He also praised its conceptual goals. I don't believe this is faint praise, especially given stereotypes of amateurs writing about physics. This is a positive part of Yudkowsky's track record. I think my comment sounds more negative about Yudkowsky's QM sequence than it deserves, so thanks for pushing back on that. 

I'm not sure what you mean when you call yourself a pro-MWI extremist, but in any case AFAIK there are physicists, including one or more prominent ones, who think MWI is really the only explanation that makes sense, although there are obviously degrees in how fervently one can hold this position, and Yudkowsky seems at the extreme end of the scale in some of his writings. And he is far from the only one who thinks Copenhagen is ridiculous. These two parts of Yudkowsky's position on MWI are not without parallel among professional physicists, and the point about Copenhagen being ridiculous is probably a point in his favor from most views (e.g. Nobel laureate Murray Gell-Mann said that Niels Bohr brainwashed people into Copenhagen), let alone this community. Perhaps I should have clarified this in my comment, although I did say that MWI is a leading interpretation and may well be correct. 

The negative aspects I said in my comment were:

  1. Yudkowsky's confidence in MWI is disproportionate
  2. Yudkowsky's conviction that people who disagree with him are making elementary mistakes is disproportionate
  3. These may come partly from a lack of knowledge or expertise

Maybe (3) is a little unfair, or sounds harsher than I meant it. It's a bit unclear to me how seriously to take Aaronson's quote. It seems like plenty of physicists have looked through the sequences to find glaring flaws, and basically found none (physics stackexchange). This is a nontrivial achievement in context. At the same time I expect most of the scrutiny has been to a relatively shallow level, partly because Yudkowsky is a polarizing writer. Aaronson is probably one of fairly few people who have deep technical expertise and have read the sequences with both enjoyment and a critical eye. Aaronson suggested a specific, technical flaw that may be partly responsible for Yudkowsky holding an extreme position with overconfidence and misunderstanding what people who disagree with him think. Probably this is a flaw Yudkowsky would not have made if he had worked with a professional physicist or something. But maybe Aaronson was just casually speculating and maybe this doesn't matter too much. I don't know. Possibly you are right to push back on the mixed states explanation. 

I think (1) and (2) are well worth considering though. The argument here is not that his position is necessarily wrong or impossible, but that it is overconfident. I am not courageous enough to argue for this position to a physicist who holds some kind of extreme pro-MWI view, but I think this is a reasonable view and there's a good chance (1) and (2) are correct. It also fits in Ben's point 4 in the comment above: "Yudkowsky’s track record suggests a substantial bias toward dramatic and overconfident predictions." 

For convenience, this is CEA's statement from three years ago:

We approached Jacy about our concerns about his behavior after receiving reports from several parties about concerns over several time periods, and we discussed this public statement with him. We have not been able to discuss details of most of these concerns in order to protect the confidentiality of the people who raised them, but we find the reports credible and concerning. It’s very important to CEA that EA be a community where people are treated with fairness and respect. If you’ve experienced problems in the EA community, we want to help. Julia Wise serves as a contact person for the community, and you can always bring concerns to her confidentially.

By my reading, the information about the reports contained in this is:

  • CEA received reports from several parties about concerns over Jacy's behavior over several time periods
  • CEA found the reports 'credible and concerning'
  • CEA cannot discuss details of most of these concerns because the people who raised them want to protect their confidentiality
  • It also implies that Jacy did not treat people with fairness and respect in the reported incidents
    • 'It’s very important to CEA that EA be a community where people are treated with fairness and respect' - why say this unless it's applicable to this case?

Julia also said in a comment at the time that the reports were from members of the animal advocacy and EA communities, and CEA decided to approach Jacy primarily because of these rather than the Brown case:

The accusation of sexual misconduct at Brown is one of the things that worried us at CEA. But we approached Jacy primarily out of concern about other more recent reports from members of the animal advocacy and EA communities. 
