Posted on Sun 05 July 2020

Getting into Machine Learning - 2020

My previous Getting into Machine Learning post is one of my most popular; since then much has changed.

There's a new kid on the block: JAX. A thin but powerful layer over Autograd and XLA, it makes it easy to concisely express algorithms with the same syntax as numpy while getting the full performance of TPUs and GPUs.
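As a small sketch of what this looks like (the loss function and data here are just illustrative): you write ordinary numpy-style array code, then ask JAX for a compiled gradient function.

```python
import jax
import jax.numpy as jnp

# Plain numpy-style code: a mean-squared-error loss for a linear model.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad differentiates w.r.t. the first argument (w);
# jax.jit compiles the result with XLA for CPU/GPU/TPU.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((4, 3))
y = jnp.ones(4)
g = grad_fn(w, x, y)  # gradient of the loss w.r.t. w
```

The same `loss` function runs unchanged as regular array code; transformations like `grad`, `jit`, and `vmap` compose on top of it.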

Especially in combination with higher-level libraries such as Haiku, JAX makes it fun and easy to try new ideas. I've migrated all my own research to JAX and can only recommend it!

The resources I recommended in my previous post are still relevant; on top of that there's plenty of new material. Some of my favorites include:

Happy hacking!

Tags: ai, ml, programming

© Julian Schrittwieser. Built using Pelican. Theme by Giulio Fidente on github.