Thanks very much for writing this! I'm glad to see more advice of this kind for the EA community.
I recently came across some advice from John Ratey which I haven't been able to verify despite some effort. Quoting from his book Spark:
> This extends what we know from the neurogenesis research: that aerobic exercise and complex activity have different beneficial effects on the brain...The evidence isn’t perfect, but really, your regimen has to include skill acquisition and aerobic exercise…choose a sport that simultaneously taxes the cardiovascular system and the brain — tennis is a good example — or do a ten-minute aerobic warm-up before something nonaerobic and skill-based, such as rock climbing or balance drills. While aerobic exercise elevates neurotransmitters, creates new blood vessels that pipe in growth factors, and spawns new cells, complex activities put all that material to use by strengthening and expanding networks.
Is it correct that skill acquisition is an important component of an exercise regimen? This seems like a weird thing for him to assert, given that he later says there is little “research into the effect of rhythm, balance, and skill-based activities on the brain.” There also seems to be an inherent tradeoff between complex movement and exercise intensity: you can't play tennis as intensely as you can do intervals when swimming, running, or cycling.
Sorry, this was a little unclear! I was thinking of the distinction made here: https://80000hours.org/articles/ai-policy-guide/#ai-policy-researcher
Practitioner: implementing specific policies
Researcher: working out which policies are desirable
What are your thoughts on AI policy careers in government? I'm somewhat skeptical, for two main reasons:
1) It's not clear that governments will become leading actors in AI development; by default I expect this not to happen. Unlike with nuclear weapons, governments don't need to become experts in the technology to field AI-based weapons; they can just purchase them from contractors. Beyond military power, competition between nations is mostly economic. Insofar as AI is an input to this, governments have an incentive to invest in domestic AI firms over in-house government AI capabilities, because this is the more effective way to translate AI into GDP.
2) Government careers in AI policy also look compelling if the intersection of AI and war is crucial. But as you say in the interview, it's not clear that AI is the best lever for reducing existentially damaging war. And within the EA community, this argument seems to have been generated as an additional reason to work on AI, rather than being the output of research into the best ways to reduce war.
Do you think the answer to this question should be a higher priority, especially given the growing number of EAs studying things like Security Studies in D.C.?