Hi, I'm an undergrad at Cornell University and an officer of an organization called ACSU (Association of Computer Science Undergraduates). As an officer, I'm required to come up with and organize at least one event every semester. I was thinking this could be a good opportunity to try to get some CS majors interested in AI safety. The event would be advertised to everyone on ACSU's mailing list, which covers pretty much the general population of undergrad CS majors. I'm looking for advice on what kind of event might be most effective.
My current top idea is to start a reading group around Brian Christian's book The Alignment Problem. I would probably make this a weekly event where we read one or two chapters per week and discuss them (probably with me as the discussion leader each week).
I particularly like that this book "leads up" to the stranger aspects of AI alignment, starting with biases in past and present AI systems and ending with x-risk material. But the biggest perk of this idea is that I could probably get everyone a free copy of the book, since I'm also the president of the Cornell Effective Altruism club and am already getting funding from CEA to buy copies of Doing Good Better and The Precipice for my club members. Even if many people don't come to the discussion meetings, advertising a completely free book might get a large number of people to sign up.
The big con is that this might seem like too large a time commitment. I read often and fairly quickly, but I'm not sure the general undergraduate CS population at Cornell would be willing to keep up. We have around 11 weeks left in the school year, which should be enough for one major chapter per week over 9 weeks. Alternatively, I could run a one-off event covering the ideas of The Alignment Problem in condensed form, but I don't know whether such a short introduction could go into sufficient depth. Maybe I could do a one-off event this semester and a reading group next semester?
Please let me know your thoughts and any other ideas you might have! I'm also wondering: does anyone know of an existing guide or curriculum for leading a discussion group on The Alignment Problem?