Ben Millwood🔸

Scope insensitivity has some empirical backing – e.g. the helping birds study – and some theorised mechanisms of action, e.g. people lacking an intuitive understanding of large numbers.

Scope oversensitivity seems possible in theory, but I can't think of any similar empirical or theoretical reasons to think it's actually happening.

To the extent that you disagree, it's not clear to me whether that's because we disagree about how EAs actually weight things like animal suffering, or about how it ought to be weighted. Are you intending to cast doubt on the idea that a problem that is 100x as large is (all else equal) 100x as important, or are you intending to suggest that EAs treat it as more than 100x as important?

While "My experience at the controversial Manifest 2024" (and several related posts) were not explicitly about policies or politicians, I think it's largely the underlying political themes that made the discussion so heated.

I have a broad sense that AI safety thinking has evolved a lot over the years, and I think it would be cool to have a retrospective along the lines of "here are some concrete things that used to be pretty central, and that we now think are either incorrect or at least incorrectly focused".

Of course, it's hard enough to get a broad overview of what everyone thinks now, let alone what they used to think but have since discarded.

(this is probably also useful outside of AI safety, but I think it would be most useful there)

I feel like my experience with notifications has been pretty bad recently – something like: I'll get a few notifications, follow the link in one, and then all the others disappear and there's no longer any way to find out what they were. This is hard to replicate confidently because I can't generate notifications on demand, but that's my impression.

[edit: it's been changed I think?]

FWIW, when I saw the title of this post, I assumed you were going to be asking for advice rather than offering it. Something like "My advice on whether it's worth [...]" would be less ambiguous, though a bit clumsier – obviously this is partly a stylistic thing and I won't tell you what style is right for you :)

(this point is different enough that I decided to make a separate comment for it)

I feel like when people talk about criticism on the Forum, they often point to how it can be very emotionally difficult for the person being criticised, and then they stop and say "this means there's something wrong with how we do criticism, and we should change it until it's not like this".

I think this is overly optimistic. I find it highly implausible that there's some way we could be, some tone we could set, that would make criticism not hurt. It hurts to be wrong, it hurts more to hear it from other people, and it hurts more still to hear it unexpectedly. These pains are dramatically worsened by hostility or insensitivity or other markers of bad criticism, but even if you do everything you're supposed to in tone and delivery, the truth is going to hurt, and sometimes it's going to hurt a lot.

So, even perfect criticism hurts. Moreover, it's highly implausible that we can attain perfect criticism, or even a particularly good approximation of it. Everywhere on the Forum, people get misread, people fail to make their point clearly, and people have tricky, complex ideas that take a lot of digesting to make sense of. In criticism, all of that happens in an emotionally volatile environment. It takes a lot of discipline to stay friendly in that context, and I don't think the fact that this sometimes fails is a uniquely EA failing. No-one anywhere has criticism that stays clean and charitable all the time. If you're asking "how are we going to ensure that bad ideas don't absorb attention, funding, and other resources that could have gone to good ideas?", I really struggle to imagine a system that always avoids argument and hostility, and I honestly think the EA Forum does better than any peer I can think of.

We're all here to do things we think are important and high-stakes, involving the suffering of those we care about. It's going to be emotionally fraught. People who write critical comments should try hard to do so in a way that minimises the harm they cause. IMO more should also be said on the Forum about how people can receive criticism in a way that minimises harm (primarily to themselves, but perhaps also to others). I do think that "sometimes just ignore the criticism" is good advice, actually. But I don't think we should aspire to "people aren't upset by what is said on the Forum" or "posting about your project on the Forum doesn't make you anxious". Reduce these things as much as possible, but be realistic about how much is possible.

I have no desire to shut down ideas for how to make things better. Please do continue to think and talk about whether criticism on the Forum could be done better. But I want better indicators of disease than "people are hurt when other people tell them they don't like their work".

I think if you experience EA primarily through the Forum, you can end up feeling like "wow, EA has a lot of criticism in it; it should have more doing". But there's a strong selection effect: most of the doing is not on the Forum, but most of the criticism is. The Forum is an appropriate venue for criticism, and usually isn't an appropriate venue for doing.

I'd probably go so far as to say that if I were writing down a list of the key, essential purposes the Forum serves, criticism would be one of them – probably one of the top four, alongside news, socialising, and the pretty small amount of "doing" that can be done through text conversation with other people.

From the other end, if you have a piece of criticism and you ask "what kind of venue or environment would be ideal for this?", then I think the Forum is not only a place for criticism but usually the place for it, because of the breadth of its audience and its shared infrastructure of norms, moderation, and archiving and organisation (e.g. tags).

So, there's a lot of criticism here, and a lot of the best posts here are criticism. This just seems fine to me, and not an indication that something's wrong. The EA Forum is just one part of what the EA community is, and it's naturally always going to be the most criticism-heavy part.

On the other hand, I think there are a lot of people who are attracted by this self-sceptical attitude, having been deeply exasperated by so many alternatives that seem uninterested in the question of whether they are a complete waste of time or not.

Obviously there's not really any objective way to settle the matter, but I disagree that critics acquire more social capital than doers. When I think of the people who seem most prestigious in EA, they're all people who got there by doing things, not by criticising anything.

I do agree that some people with a lot of social capital are seemingly oblivious to how that capital affects the weight of what they say, and I think it's good to point out when this is happening, but the examples I can think of are still people who got that capital by doing things.

"concepts which become part of the community have close analogies that have been better studied in academic literature"

If they got into the community from the academic literature, this isn't reinventing the wheel, right? At worst it's rebranding the wheel, which feels like a different thing.

For example, is conservation of expected evidence an instance of reinventing the wheel, because this particular name for it is (as far as I know) a LessWrong innovation? I'm sure they (we?) didn't rediscover the theorem from basic principles.
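(For reference – stated loosely, and in my own notation – conservation of expected evidence is just the law of total probability applied to posteriors: your prior in a hypothesis H must equal the expectation of your posterior over the possible evidence E,

$$P(H) = P(H \mid E)\,P(E) + P(H \mid \neg E)\,P(\neg E),$$

so you can't expect evidence you haven't yet seen to move your credence in any particular direction.)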

I suppose you might still regard this as a point of criticism insofar as it creates jargon barriers, or insofar as you draw indirect (and IMO tenuous) inferences about a lack of collaboration with the mainstream (i.e. we can only get away with using different words because people who use the "normal" words don't talk to us). But I wouldn't want people to conclude from this that LW is unfamiliar with mainstream probability theory.
