Andaro

-44 karma · Joined Nov 2017

Comments (15)

I would beware the political backlash and retaliation costs from #2. What you are classifying as "ethical flaws" is really a difference in agenda.

In a representative democracy, government spending is supposed to be allocated according to the best interests of taxpayers, voters, and citizens. Those are, of course, human beings living in the present with citizenship in the respective country. Trying to game the system so that it starts allocating those resources differently is not fixing an ethical flaw; it's a shift in agenda that does not match the principle of representation.

You may not care about that, but you should care about the political and social backlash EA will deservedly get if it undermines our best interests as voters, taxpayers, and citizens of the countries you are trying to co-opt.

Read free stories online. The biggest cost is the effort of finding the best 10% among all the free stories out there. But those are very much worth reading, and you can spend countless hours quite entertained at effectively no cost.

Yet preventing such cases should not be lexically prior to any other consideration: we should be willing to gamble utopia against extinction at the chance of a single terrible life of 1/TREE(9).

I disagree; it is lexically, deontologically more important not to create an innocent rape or nc torture victim than to cause any amount of happiness or utopian gain for others. The number is also absurd: terrible lives in the millions are a stochastic inevitability even just on Earth within each generation. Just look at the attempted suicide rates.

Statistical outliers say life, even in the historically propitious circumstances of the affluent west, is not good for them. Their guardian angels shouldn't actualize them. Yet uncertainty over this, given the low base-rates of this being the case, doesn't give them right of veto across the innumerable multitudes who could rejoice in an actual future.

I disagree; the right not to be tortured or raped without one's consent is lexically more morally important than the interest of others to rejoice in a good future. Rape doesn't become moral even if enough spectators enjoy the rape video; nc torture doesn't become moral even if enough others rejoice in the knowledge of the torture. Victimizing nc innocents in this way is not morally redeemable by the creation of utopias populated by lucky others. There is no knowledge that our descendants could discover that would change this.

I often read rape and torture scenes in fiction - you could also watch Game of Thrones for the same effect - and while I enjoy the reading, I am often horrified by the thought that real equivalents exist. If you want a good example, read this. (Content warning: rape and torture, obviously.) Now, I love these stories as much as the next guy, but they also make me reflect: if I could choose to create a universe where this happens once and intergalactic utopias filled with happy life also exist, or a universe that is empty, I would choose the universe that is empty. And I think it's utterly morally absurd to choose otherwise. It's churched-up evil.

Of course, you don't have to look for fiction, just remember that actual nc child torture is still legal in the US, the UK, and France, among other countries. Or read the piece about North Korea on this forum. Humanity has no redeeming qualities that could morally justify the physical reality of these systems. It never will.

Similar to the above, myself (and basically everyone else) take our futures to be worth living for on selfish grounds

I don't. Plus, for those who see it your way, it's consensual (though not necessarily rational). Those who disagree are, of course, victimized by the anti-suicide religionists and their anti-choice laws. It's not like people have an actual right to exit from this shitshow.

Humanity's quantitative track record is obviously upward (e.g. life expectancy, child mortality, disease rates, DALY rates, etc.).

This can turn around as per-capita incomes fall, which inevitably happens in a Malthusian scenario. And Malthusian scenarios are not outlier-probability scenarios but are expected with high (mainstream) probability, because any fast-reproduction technology without global centralized suppression makes a Malthusian outcome near-inevitable (any fast-reproduction tech, not just ems).

Moral progress is not a robust law of nature. It could be contingent on other factors that can turn around, or it could simply be a random walk with reversion to the mean to be expected, combined with distortions of perception (any generation will consider its values superior to those of prior generations and therefore perceive moral progress, no matter what directions the values actually took or why).

If it turns out that the only thing that makes things good is happiness, we can tile the universe in computronium and simulate ecstasy (which should give amounts of pleasure to pain over the universe's history not '10% higher', but more like 10^10:1, even with extreme trade-off ratios).

Several problems here. (1) The numbers are absurdly overoptimistic: you assume lots of hedonium with near-zero torture. Hedonium doesn't carry its own economic weight, and the future will likely be dominated by Malthusian replicators who are optimized not for ecstasy but for competitive success in replication.

(2) You assume our descendants will be rational moral beings who implement our idealized moral values (far mode), when in reality they will almost certainly be constrained by intense competitive pressures and act on selfish incentives (near mode). They would use victimization as a means to an end just as readily as current people eat factory-farmed meat; indeed, value drift makes it even more likely that they won't share our already-meager humane values, e.g. their altered psychology may have optimized empathy and justice instincts out completely.

Maybe what's really going on here is you're making a bid for status by accusing others of being status seeking

Hahahaha. I'm at -12 karma because I wrote what I think instead of what people here want to hear. And I knew well in advance that this would happen. If I wanted status, I'd join a group in person and give lip-service to the community dogma. Probably the Catholic church, then I could sing hallelujah all day long and scoff at those filthy atheists while covertly grooming young girls for sexual use. And you know what, I'd probably be happier that way. Problem is, I'm not a good enough liar, and I despise gullible people far too much to play the pretend game.

critique of negative utilitarianism

Except I never argued for Negative Utilitarianism. Misrepresenting the arguments I made as such is a complete strawman.

For example, I don't believe there's a moral reason to prevent people who want pain and consent to it, from having pain.

Neither do I believe that there's a moral reason to prevent suffering for the guilty who have forced it on nonconsenting innocents. You, for example, have actively worked to cause it for a very large number of innocent nc victims, and therefore I do not believe there is a moral reason to prevent your suffering or victimization, even if it is nc.

It appears I was downvoted to -10 karma by people who didn't even read my posts.

Nice exercise in goalpost-moving, kbog.

Look dude

Errrr, no.

(This is a long comment. Only the first four paragraphs are in direct response to you. The rest is still true and relevant, but more general. I don't expect a response.)

Childbirth is not an act of self-sacrifice. It never was. There was not even one altruistic childbirth in all of history. It was either involuntary for the female (vast majority) or self-serving (females wanting to have children, to bind a male in commitment, or to get on the good side of the guy who can and will literally burn you alive forever).

I'm not saying there is never any heroism if the hero can harvest the status and material advantages from it. But when the heroic act can be discreetly skipped and there is no such external reward, motivation in practice does look slim indeed.

Even if you're a statistical outlier, consider the possibility that you'd be saving a large ethical negative, which would be a tragic mistake rather than a good thing.

If you personally would be willing to pre-commit, that's at least some form of consent. In contrast, the actual victimization in the future is largely going to be forced on nonconsenting victims. There's a moral difference. It's hard to come up with something even in principle that could justify that.

Not to mention humanity's quantitative track record is utterly horrible. Some improvements have been made, but it's still completely irredeemable overall. Politics is a disgusting, vile shitshow, with top leaders like the POTUS openly glorifying torture-blackmail.

Seriously, I have never seen an x-risk reducer paint a realistic vision of the future, outline its positives without handwaving, stay honest and within the realm of probable outcomes, so that a sane person could look at it and say, "Okay, that really is worth torturing quintillions of nc victims in the worst ways possible."

If they can be bothered to address it at all, you'll find mostly handwaving, e.g. Derek Parfit in his last publication dismissing the concern with one sentence about how "our successors would be able to prevent most human suffering". It's the closest they've got to an actual defense. Ignoring, of course, that torture is on purpose and technology just makes that more effective. Ignoring also that even if suffering becomes relatively rarer, it will still happen frequently, and space colonization implies a mind-boggling increase in the total.

Ignoring also the more fundamental question why even one innocent nc victim should be tortured for the sake of... what, exactly? Pleasure? Human biomass? Monuments? They never really say. It's not like these people are actually rooting for some specific positive thing that they're willing to put their names on, and then actually optimize that thing.

If Peter Singer came out and said he wants x-risk reduced because he expects 10% more pleasure than pain from it and he'll bite all the utilitarian bullets to get there, advocating to spread optimized pleasure minds rather than humans as much as possible and prevent as much pain as possible by any means necessary, I would understand. I would disagree, but it would be an actual, consistent goal.

But in practice, this usually doesn't happen. X-risk reducers use strategic vagueness instead. The reason for that is rather simple: "Yay us" yields social status points in the tribe, and humanity is the current default tribe for most intellectuals of the internet era. So x-risk reduction advocacy is really just intellectualized "yay us" for the internet era. As long as it is not required, bullets will not be bitten and no specific goals will be given. The true optimization function, of course, is the advocate's own social status.

That's incorrect.

You can't make a thread saying sexual violence is bad because it leads to suicide, and then not allow people to discuss the consent principle as it pertains to suicide.

If you use "lives saved" numbers that imply involuntary survival is good, then you will get commenters pointing out that this violates the consent principle. You are not immune to criticism.

Don't want to discuss suicide? Then don't bring it up.

The other points crossed some inferential distance, but they were both relevant and correct. It really is true that most rape currently happens among nonhuman animals, and that x-risk reduction efforts imply actively causing a future that contains astronomical amounts of additional rape. This is both true and relevant, even if it goes against the usual euphemistic framing and may therefore sound counterintuitive to you.

This may sound rude, but I don't believe you.

Of course, if you consented, it would be consensual. The actual torture will be nonconsensual.
