
Isaac King

280 karma · Joined Sep 2021

Comments (40)

Creating identical copies of a person is not claimed to sum to less moral worth than one person. It's claimed to sum to no more than one person. Torturing one person is still quite bad.

Downvoting, as you seem to have either not read the first section or chosen to ignore it; I explain there why it would matter less to torture a copy. I can't meaningfully respond to criticisms that don't engage with the argument I presented.

Probably, yeah. But that seems hard to square with a consistent theory of moral value, given that there's a continuum between "good" and "bad" experiences.

I would add to #2 that the number of shrimp being farmed is at least as relevant as brain size, if not more so. The total number of experiences is surely still quite large by ordinary human standards, but could be small relative to the massive number of shrimp in existence.

I didn't mean it to be evidence for the statement, just an explanation of what I meant by the phrase.

Do you disagree that most people value that? My impression is that wireheading and hedonium are widely seen as undesirable.

Yeah, I don't do it on any non-LW/EAF post.

Yeah, most of the p(doom) discussions I see seem to focus on the nearer term of 10 years or less. I believe there are quite a few people (e.g. Gary Marcus, maybe?) who operate under a framework like "current LLMs will not get to AGI, but actual AGI will probably be hard to align", so they may give a high p(doom before 2100) and a low p(doom before 2030).

Oh, I agree. Arguments of the form "bad things are theoretically possible, therefore we should worry" are bad and shouldn't be used. But "bad things are likely" is fine, and seems more likely to reach an average person than "bad things are 50% likely".

Isn't that what the strong upvote is for?
