
HedonicTreader

9 karma · Joined Sep 2015

Comments (4)

"banning the most toxic pesticides, and having better controls for pesticides like centralized storage"

I applaud your goal of preventing painful suicide attempts, but I think your approach is wrong-headed. If you ban pesticides, there will still be other bad methods available, e.g. CO poisoning.

My suggestion would be to make pentobarbital available to everybody who wants to die, perhaps with a short waiting period so they are forced to think it through. But of course, that's not politically feasible.

Your approach of banning and restricting doesn't actually add value to anybody's life. It doesn't make their lives better. It doesn't fix the reasons they want to die in the first place. It doesn't pay their rent or put food on the table. It doesn't even remove painful suicide attempts from the demographic.

I would go so far as to say that doing nothing is actually better than this naive paternalism. But that may be too harsh. Even so, it certainly doesn't beat GiveDirectly. At least they add actual value to people's lives (other than the paid lobbyists, that is).

(No personal offense intended in this post)

[This comment is no longer endorsed by its author]

Concept of suffering != experience of suffering.

Human babies don't have such concepts either, but the experience of suffering is still real for them.

Michael, I like your blog and enjoyed the post.

I agree there are no good charities for hedonistic utilitarians at the moment: they are either not well aligned with hedonistic utilitarian goals, or their cost-effectiveness is not tractable. (You can still donate if you have so much money that the alternative would be a bigger car or yacht; otherwise it doesn't make much sense.)

Your ideas are all interesting, but values spreading and promoting universal eudaimonia are non-starters. You get downvoted on an EA forum, and you are not going to find a more open-minded, amicable target group than this.

Happy animals are problematic because their feedback is limited: you don't know when they are suffering unless you monitor them with unreasonable effort, and their minds are not optimized for high pleasure and low suffering. Perhaps future technology will make this sort of thing trivial, but that is not certain, and investing in the necessary research would hand too much harmful knowledge to people who are not value-aligned. Even if funding such research were net good, it will probably be done for other reasons anyway (commercial applications, publicly funded neurology, etc.), so again it's something you should only fund if you have too much money.

I don't know enough about insect biology to judge humane insecticides; the idea is certainly not unrealistic. But remember that real people would have to use it preferentially, so even if such a charity existed, there's no guarantee anyone would use its product instead of laughing you out of the room.

Lila, the future may not be controlled by a singleton but by a plurality of people implementing diverse values. And even if it is, the singleton may not maximize a single value but rather a mix of the different values that different people care about - a compromise "value handshake", as Scott Alexander called it.

Thus, it is best to emphasize that you are not paperclip minimizers - that is, not actively opposed to what others value. The same goes for hedonium, unaided scientific insight, longevity, or art, to name just a few things some transhumanists value and others don't.

There are two kinds of value conflicts: Ones where the values are merely orthogonal, and ones where values are either diametrically opposed or at least strongly negatively correlated in practice.

The orthogonal ones are still in conflict where limited resources are concerned - but not otherwise. It is much easier to find a compromise between them than between values that are opposed or negatively correlated in practice.

There is no reason why a singleton could not spend some resources on paperclips, some on hedonium, some on bigger happy minds, some on Fun, some on art, some on biodiversity, etc., if this increases the probability that people will compromise on letting the singleton come into being and remain functional.