My previous Getting into Machine Learning post is one of my most popular, but much has changed since I wrote it.
There's a new kid on the block: JAX. A thin but powerful layer over Autograd and XLA, it lets you express algorithms concisely with the same syntax as NumPy while getting the full performance of TPUs and GPUs.
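To give a flavour of what that looks like in practice, here is a minimal sketch (the function and variable names are my own toy example, not from any particular library): you write a plain NumPy-style function, and JAX gives you its gradient and an XLA-compiled version with one-line transformations.

```python
import jax
import jax.numpy as jnp

# An ordinary NumPy-style function: mean-squared error of a linear model.
def loss(w, x, y):
    return jnp.mean((x * w - y) ** 2)

# jax.grad differentiates it; jax.jit compiles it with XLA for TPU/GPU speed.
grad_loss = jax.jit(jax.grad(loss))

x = jnp.array([1.0, 2.0, 3.0])
y = jnp.array([2.0, 4.0, 6.0])
g = grad_loss(1.0, x, y)  # gradient of the loss with respect to w
```

The appeal is that `loss` reads exactly like NumPy code, yet the same function can be differentiated, JIT-compiled, or vectorised without rewriting it.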
Especially in combination with higher-level libraries such as Haiku, JAX makes it fun and easy to try new ideas. I've migrated all my own research to JAX and can only recommend it!
The resources I recommended in my previous post are …