
Robin Larson

Assumptions I made when writing this:

1. Either there are no actually existing worldviews whose only goals are suicidal, or, if there are, it isn't worth changing "goals" to "non-suicidal goals" to account for them, since that would make the argument unnecessarily confusing to most audiences.

2. Either we can't influence the heat death of the universe (or whatever its ultimate fate turns out to be), or at least we can't currently do so in a way comparable to our influence over AI.

3. Either the entire universe is traversable, or all life we will ever have evidence of lies within a traversable range.

The existence of life is a prerequisite for the goals of every worldview. AI is the only force we can influence that has the potential to wipe out all life in the universe. Therefore, AI safety is the only thing we can influence that matters to every living thing holding a worldview with goals.