All of Giga's Comments + Replies

I have not seen that, but I will check it out.

As for the existential threat, it is for a few reasons; I will make a more detailed post about it later. First off, I believe very few things are existential threats to humanity itself. Humans are extremely resilient and live in every nook and cranny on Earth. Even total nuclear war would have plenty of survivors. As far as I see it, only an asteroid or aliens could wipe us out unexpectedly. AI could wipe out humanity; however, I believe it would be a voluntary extinction in that case. Future humans may believe ... (read more)

Just because something is difficult doesn't mean it isn't worth trying to do, or at least trying to learn more about so you have some sense of what to do. Calling something "unknowable" -- when the penalty for not knowing it is "civilization might end with unknown probability" -- is a claim that should be challenged vociferously, because if it turns out to be wrong in any aspect, that's very important for us to know.

I cannot imagine future humans being so stupid as to have AI connected to the internet and a robot army able to be hijacked by said AI at the same ... (read more)