My previous Getting into Machine Learning post is one of my most popular; since then much has changed.
There's a new kid on the block: JAX. A thin but powerful layer over Autograd and XLA, it makes it easy to express algorithms concisely with the same syntax as numpy while getting the full performance of TPUs and GPUs.
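To give a flavor of what that looks like, here's a minimal sketch (assuming `jax` is installed; the toy loss function is my own illustration, not from any particular tutorial):

```python
import jax
import jax.numpy as jnp

def loss(w):
    # Ordinary numpy-style code: mean squared error of a toy linear model.
    x = jnp.array([1.0, 2.0, 3.0])
    y = jnp.array([2.0, 4.0, 6.0])
    return jnp.mean((w * x - y) ** 2)

# jax.grad differentiates the Python function directly;
# jax.jit compiles the result with XLA for TPU/GPU speed.
grad_loss = jax.jit(jax.grad(loss))
print(grad_loss(1.0))
```

The notable part is that `loss` is just plain numpy-flavored Python; `grad` and `jit` are composable function transformations layered on top.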
The resources I recommended in my previous post are still relevant; on top of that there's plenty of new material. Some of my favorites include:
- Dive into Deep Learning, an interactive deep learning book
- OpenAI's Spinning Up in Deep RL, which covers a set of core deep RL algorithms
- Distill, an interactive online journal with great visualizations