Thanks for tracking this. I definitely think being aware of and extremely up-to-date on "novel pandemics" is one of the things our community (~GCR-focused EAs) should have at least a few people working on, though it's a hard thing to do without either undershooting or overshooting.
(It's possible these people already exist and are working quietly. In which case, great! I'm not sufficiently in touch with the space to know either way.)
A meta-norm I'd like commentators[1] to have is to Be Kind, When Possible. Some subpoints that might be helpful for enacting what I believe to be the relevant norms:
I do regret using the Holocaust example. The example was loosely based on one speaker who appeared to be defending eugenics by saying that the Holocaust was actually considered a dysgenic event by top Nazi officials.
That sounds like an obviously invalid argument! Now, a) I didn't attend that talk, b) many people are bad at making arguments, and c) I've long suspected that poor reasoning is positively correlated with racism (and that this holds even after typical range restriction). So it's certainly possible that the argument they made was literally that bad.
But I think it's more likely that you misunderstood their argument.
This is a rough draft of questions I'd be interested in asking Ilya et al. re: their new ASI company. It's a subset of questions that I think are important to get right for navigating the safe transition to superhuman AI.
(I'd put only ~3-7% probability on this reaching Ilya or another cofounder organically, e.g. because they read LessWrong or via a vanity Google search. If you do know them and want to bring these questions to their attention, I'd appreciate you telling me so I have a chance to polish the questions first.)
I'll leave other AGI-safety-relevant questions, like alignment, evaluations, and short-term race dynamics, to others with greater expertise.
I do not view myself as an expert on the questions I ask either; they're just ones where I perceive relatively few people are "on the ball," so to speak, so hopefully a generalist paying attention to the space can be helpful.
Thanks, edited! :)