TL;DR: Having a good research track record is some evidence of good big-picture takes, but it's weak evidence. Strategic thinking is hard, and requires different skills. But people often conflate these skills, leading to excessive deference to researchers in the field without evidence that those researchers are good at strategic thinking specifically. I certainly try to have good strategic takes, but it's hard, and you shouldn't assume I succeed!
Introduction
I often find myself giving talks or Q&As about mechanistic interpretability research. But inevitably, I'll get questions about the big picture: "What's the theory of change for interpretability?", "Is this really going to help with alignment?", "Does any of this matter if we can’t ensure all labs take alignment seriously?". And I think people take my answers to these way too seriously.
These are great questions, and I'm happy to try answering them. But I've noticed a bit of a pathology: people seem to assume that because I'm (hopefully!) good at the research, I'm automatically well-qualified to answer these broader strategic questions. I think this is a mistake: a form of undue deference that is both unwarranted and unhelpful. I certainly try to have good strategic takes, and I think this makes me better at my job, but trying is far from sufficient. Being good at research and being good at high-level strategic thinking are just fairly different skillsets!
But isn’t someone being good at research strong evidence they’re also good at strategic thinking? I personally think it’s moderate evidence, but far from sufficient. One key factor is that a very hard part of strategic thinking is the lack of feedback. Your reasoning about confusing long-term factors needs to extrapolate from past trends and make analogies from things you do understand better, and it can be quite hard to tell whether what you're saying is complete bullshit or not. In an empirical science like mechanistic interpretability, however, you can get a lot more feedback.
Why isn't anyone talking about the Israel-Gaza situation much on the EA Forum? I know it's a big time for AI, but I just read that the number of Palestinian deaths, the vast majority of whom are innocent people and 65% of whom are women and children, is approaching, in just the last 3-4 weeks, the number of civilians killed in Ukraine in the 21 months since the Russian invasion.
The Israel-Gaza situation doesn't strike me as very neglected or tractable. The eyes of much of the world are on that situation, and it's not clear to me that EA actors have much to add to the broader conversation. It's also not clear to me why we would expect actions that EA actors could take to have a significant impact on the situation.
This is a non-comprehensive and lightly held view -- for example, it focuses on intellectual and financial contributions. I'd be interested in hearing why you think the Israel-Gaza situation is more neglected and tractable than I tentatively think it is.
That area is controlled by militaries that might retaliate against people who find clever ways to smuggle aid into the conflict zone. So trying to help people there instead of elsewhere is the wrong move.
EA was probably wrong to prioritize bednets over a malaria vaccine, even though lots of children would have died horribly if a malaria vaccine had been invented 5 years later instead of them getting bednets now. It might seem harsh, but saving fewer lives instead of more is even harsher for the people whose lives aren't saved, even if it's accidental.
Hm, what would you expect/hope people to discuss about it here? As far as I remember, people didn't talk much about the Ukraine-Russia war either. Probably because there's not much that most EAs (or people in general) can do about it (not tractable) + it's already being widely discussed elsewhere (not neglected).