I'm looking for a way to make sure I reliably learn about the biggest developments, such as Transformers, AlphaFold, or the grokking paper. I don't currently want to spend too much time on this, so the optimal frequency would probably be monthly, though weekly is fine as well.
If you have other mechanisms of staying up to date with machine learning, I'd be curious to hear about those as well.
The Transformers paper ("Attention Is All You Need") was only a poster at NIPS 2017 (not even a spotlight, let alone an oral presentation). I don’t know if anyone at the time predicted the impact it would have.
It’s hard to imagine a newsletter that could have picked out that paper at the time as among the most important of the hundreds included. For comparison, I think that at the time there was probably much more hype and discussion around Hinton and his students’ capsule networks (which also had a NIPS 2017 paper).
I think this is generally true of ML research: it’s usually very hard to predict impact in advance. You could probably do pretty well with a six-month to one-year lag, though.
I’d recommend the TWIML podcast, which interviews a range of good researchers, though not only about the biggest developments.
Wow, that certainly is more “attention” than I remember at the time.
I think filtering on that level of hype alone would still leave you reading way too many papers.
But I can see that it might be more plausible for someone with good judgment and a finger on the pulse to do a decent job of predicting what will matter (although then maybe that person should be doing research themselves).