It seems that if we can't make even the basic versions of these tools well aligned with us, we won't have much luck with future, more advanced versions.
Therefore, all AI safety people should work on alignment and safety challenges with AI tools that currently have users (image generators, GPT, etc).
Agree? Disagree?
Agree that some could. Since you brought it up, though, how would you align image generators? They're still dumb tools, so do you mean align the users? Add safety features? Stable Diffusion shipped with a few safeguards, but users can easily disable them. Now it's generating typical porn as well as more dangerous or harmful things, I suspect, but only because people are using it that way, not because it does that on its own. So, do you want the Stable Diffusion source code removed from the web? I second the motion, lol.