
dain

15 karma · Joined May 2022

Comments (4)

The difficulty I have with this argument is where you draw the line with sentience. And if there's a living thing just below the line, without "real feelings" or interests but still able to experience pain or other sensations, would you not treasure it?

One issue with my post, I realise, is that perhaps by definition you need a sentient being to feel real empathy with; but what I had in mind wasn't strictly empathy, but caring for or treasuring things.

In a sense it's more of an invitation to a thought experiment: extending our circle of concern regardless of utility. So to answer your question, it's treasuring / appreciating / valuing / finding delight in anything really, for the mere fact that it came together from cosmic dust. Even if something has no utility for a sentient being, we would favour not destroying or harming it.

That being said, I'm of course not saying we should care more about a tuft of grass than about a goat, for example (and prevent the goat from eating the grass out of concern for the grass's wellbeing), or put more effort into preserving minerals than into farm animal welfare, etc. Instead, as a concrete example, we could consider the effects of our (over)consumption in increasing entropy and decreasing natural beauty, even if mining a bare hill without vegetation doesn't impact anything living.

Thanks a lot, really appreciate these pointers!

I'm practically new to AI safety, so reading this post was a pretty intense crash course!

What I'm wondering, though, is this: even if we suppose we can solve all the technical problems and create a completely beneficial, Gaia-mother-like AGI which is both super-intelligent and genuinely wants the best for humanity and the rest of earthlings (or even the whole universe), how can humans themselves even align on:

1. What the goals and priorities should be, given limited resources, and

2. What the reasonable contours of the solution space should be: which options won't cause harm, or, since avoiding all harm is impossible, which harms would be acceptable for which gains?

In other words, to my naïve understanding it seems like the philosophical questions of what is "good" and what an AGI should even align to are the hardest bit?

I mean, obviously not obliterating life on Earth is a reasonable baseline, but it feels a bit unambitious? Or maybe this is just a completely different discussion?

Why not start from the other end and work backwards? Why wouldn't we treasure every living being and non-living thing?

Aren't insects (just to react to the article) worth protecting as an important part of the food chain (from a utilitarian standpoint), for biodiversity (the resilience of the biosphere), or even for simply being? After all, there are numerous articles and studies about their numbers and species declining precipitously; see for example: https://www.theguardian.com/environment/2019/feb/10/plummeting-insect-numbers-threaten-collapse-of-nature

But let's stretch ourselves a bit further! What about non-living things? Why not give a bit more respect to objects, starting by reducing waste? If we take a longtermist view, there will absolutely not be enough raw materials for people for even 100–200 more years – let alone 800,000 – at our current (and increasing) global rates of resource extraction.

I'm not saying these should be immediate priorities over human beings, but I really missed these considerations in the article.