
(How to independent study) 

Stephen Casper (https://stephencasper.com/) was giving advice today on how to upskill in research, and suggested doing a "deep dive". 

Deep dive: read 40-50 papers in a specific research area you're interested in going into (e.g. adversarial examples in deep NNs). Take notes on each paper. You'll then have knowledge comparable to people working in the area, after which you do a synthesis project where you write something up (could be a lit review, could be something more original than that). 

He said he'd trade any class he'd ever taken for one of these deep dives, and that they're worth doing even if one takes ~4 months.  

*cool idea

This sounds like a great idea and aligns with my growing belief that classes are, more often than not, far from the best way to learn.

I think classes are great given that they're targeting something you want to learn and you're not unusually self-motivated. They add a lot of structure and force engagement (e.g. homework, problem sets) in a way that's hard to find the time / energy for by yourself. You also get a fair amount of guidance and scaffolding, plus information presented in a pedagogical order — though with a lot of variance due to the skill and time investment of the instructor, the size of the class, the quality of the curriculum, etc. 

But if you DO happen to be very self-driven, know what you want to learn, and (in a research context) are the type of person who can generate novel insights without much guidance, then heck yes, classes are inefficient. Even if you're not all of these things, it certainly seems worth trying to see if you can be, since self-learning is so accessible and one learns a lot by being focusedly confused. I like how neatly packaged the deep-dive idea above is: it gives me enough structure to have a handle on it, which makes it feel unusually feasible to do. 

But yeah, for the people who are best at deep dives, I imagine it's hard for any class to match — even given how high-variance classes can be :). 

Update on my post "Seeking social science students / collaborators interested in AI existential risks" from ~1.5 months ago: 

I've been running a two-month "program" with eight of the students who reached out to me! We've come up with research questions from my original list, and the expectation is that each person works 9h/week as a volunteer research assistant. I've been meeting with each person / group for 30min per week to discuss progress. We're halfway through this experiment, with a variety of projects and states of progress; hopefully you'll see at least one EA Forum post from those students! 

I was quite surprised by the interest this post generated; ~30 people reached out to me, and a large number were willing to do volunteer research for no credit / pay. I ended up working with eight students, mostly based on their willingness to work with me on some of my short-listed projects. I was willing to let their projects drift significantly from my original list if the students were enthusiastic and the project felt decently aligned with risks from long-term AI, and that did occur. My goal here was to get some experience training students who had limited research experience, and I've been enjoying working with them. 

I'm not sure how likely it is that I'll continue working with students past this two-month program, because it does take up a chunk of time (made worse by trying to wrangle schedules), but I'm considering what to do going forward. If anyone else is interested in mentoring students with an interest in long-term risks from AI, please let me know — I think there's demand! It's a decently low time commitment (30min per student or group of students per week) once you've got everything sorted. However, I'm doing it for the benefit of the students rather than with the expectation of getting help on my own work, so it's more of a volunteer role.