My previous Getting into Machine Learning post is one of my most popular; much has changed since then.
There's a new kid on the block: JAX. A thin but powerful layer over Autograd and XLA, it lets you express algorithms concisely with the same syntax as NumPy while getting the full performance of TPUs and GPUs.
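To give a flavor, here's a minimal sketch (not from the original post; the linear-model loss and array shapes are just illustrative) of the two transformations that make JAX tick, `jax.grad` and `jax.jit`:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Mean squared error of a simple linear model -- plain NumPy-style code.
    return jnp.mean((x @ w - y) ** 2)

# grad differentiates the function; jit compiles it via XLA for CPU/GPU/TPU.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(grad_fn(w, x, y))  # gradient of the loss w.r.t. w
```

Because the transformations compose, swapping `jax.grad` for `jax.vmap` or stacking them gives you batching, higher-order derivatives, and more, all without rewriting the function.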
Especially in combination with higher-level libraries such as Haiku, JAX makes it fun and easy to try new ideas. I've migrated all my own research to JAX and can only recommend it!
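As a taste of what Haiku adds on top, here's a small sketch (again illustrative; the layer sizes and input shape are made up) of its core pattern, where `hk.transform` turns an object-oriented network definition into a pure init/apply pair that plays nicely with JAX's transformations:

```python
import haiku as hk
import jax
import jax.numpy as jnp

def forward(x):
    # Define the network inline; Haiku tracks its parameters for us.
    mlp = hk.nets.MLP([128, 10])
    return mlp(x)

model = hk.transform(forward)

# init creates the parameters; apply runs the (now pure) forward pass.
params = model.init(jax.random.PRNGKey(42), jnp.ones([1, 784]))
logits = model.apply(params, None, jnp.ones([1, 784]))
```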
The resources I recommended in my previous post are still relevant; on top of that there's plenty of new material. Some of my favorites include:
- Dive into Deep Learning, an interactive deep learning book
- OpenAI's Spinning Up in Deep RL, a hands-on introduction covering the core deep RL algorithms
- Distill, an interactive online journal with great visualizations
Happy hacking!
Tags: ai, ml, programming