How important is visualization to the interpretability (or broader alignment) problem? Specifically, is there a need for, and an opportunity for impact from, frontend engineers in that space?
I’ve got 10 years of experience in software engineering, most of it on frontend data visualization, currently at Google (previously at Microsoft). I looked around at some teams within Google and found Tensorboard and the Learning Interpretability Tool, but it’s unclear to me whether those teams are bottlenecked by visualization implementation problems or by the research problem of knowing where and how to even look. I’d like to have more background before I cold-call them directly.
I’ve started to burn out on the earning-to-give path and am currently considering semi-retirement to focus on other pursuits. But if there’s somewhere I can contribute to alignment without needing to go back for a PhD, that would be perfect (I have been eagerly studying ML on the side, though).