I have not seen that, but I will check it out.
As for the existential threat, there are a few reasons; I will make a more detailed post about it later. First off, I believe very few things are existential threats to humanity itself. Humans are extremely resilient and live in every nook and cranny on Earth. Even total nuclear war would have plenty of survivors. As far as I see it, only an asteroid or aliens could wipe us out unexpectedly. AI could wipe out humanity, but I believe it would be a voluntary extinction in that case: future humans may come to believe AI has qualia and is much more efficient at creating utility than biological life. I cannot imagine future humans being so stupid as to have AI connected to the internet while also fielding a robot army that said AI could hijack.
I do believe there is an existential threat to civilization, but it is not present yet, and we will be capable of self-sustaining colonies off Earth by the time it arises (meaning that accelerating spaceflight is itself a form of existential threat reduction). Large portions of Africa, and smaller portions of the Americas and Asia, are not yet at a civilizational level where a collapse is possible, though they will likely cross that threshold this century. If there is a global civilizational collapse, I do not think civilization would ever return. However, there are far too many unknowns as to how to meaningfully avoid such a collapse. Want to prevent a civilization-ending nuclear war? You could try to bolster the power of the weaker side to force a cold war. Or maybe you want to make the sides more lopsided so intimidation alone will be enough. But since we do not know which strategy is more effective, and they call for opposite actions, there is no way to know whether you would be increasing existential threats or reducing them.
Lastly, most existential threat reduction is political by nature. Politics are extremely unpredictable and extremely hard to influence even if you know what you are doing. Politics have incredibly strong driving forces behind them, such as nationalism, desperation/fear, and corruption, and these forces can easily drown out philosophy and the idea of long-term altruism. People want to win before they do good, and largely believe they must win to do the most good.
TLDR: I believe most "existential threats" are either not existential or not valid threats; those that do exist have no knowable way to minimize them; and the political nature of most forms of existential threat reduction makes them nearly impossible to influence in the name of long-term altruism.