Preface
As a programmer and compiler writer, I became frustrated with how non-constructive and disjointed my learning experience was in the discipline of machine learning systems. I needed a single resource like SICP and its spiritual successor PAPL, which provided a Feynman-like counting-to-calculus progression, but for computation. In SICP and PAPL, you start with the elements of programming on a substitution model (i.e., the lambda calculus), iteratively deepen the semantics of computation with the stack and the heap, and end with the implementation of your own interpreter and a compiler targeting a physical register machine (i.e., a von Neumann machine).
I needed a SICP for the era of Software 2.0.
This book is the answer to my original frustrations. It follows the SICP philosophy with a systems bent, given that in the era of machine learning, performance is itself a feature. If you feel similarly, you may benefit from it too. The Structure and Interpretation of Tensor Programs will take you from zero to hero in machine learning by implementing your own distributed parallel compiler for differentiating high-dimensional functions, line by line, from scratch.
Good luck on your journey.
Are you ready to begin?