A comprehensive textbook that covers the following:
- Prerequisites for deep learning (calculus, linear algebra, and matrix manipulation)
- Linear Neural Networks for Regression and Classification
- Multilayer Perceptrons (algorithms and theory)
- Building Custom Layers using the PyTorch API
- Foundations of efficient data loading and GPU utilization
- Convolutional Neural Networks
- Modern Convolutional Neural Networks (CNNs with Blocks)
- Recurrent Neural Networks
- Attention Mechanisms and Transformers
- Optimization Algorithms
- Computational Performance Considerations
- Computer Vision
- Natural Language Processing
- Reinforcement Learning
- Gaussian Processes
- Hyperparameter Optimization
- Generative Adversarial Networks
- Recommender Systems
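To give a flavor of the custom-layers topic above, here is a minimal sketch of the standard PyTorch pattern of subclassing `nn.Module`. This example is my own illustration of the technique, not an excerpt from the book; the layer simply centers its input by subtracting the mean:

```python
import torch
from torch import nn


class CenteredLayer(nn.Module):
    """A custom layer with no parameters: shifts its input to zero mean."""

    def forward(self, x):
        # Subtract the mean so the output of the layer has mean zero.
        return x - x.mean()


layer = CenteredLayer()
out = layer(torch.tensor([1.0, 2.0, 3.0, 4.0, 5.0]))
print(out.mean().item())  # approximately 0.0
```

Because `CenteredLayer` is a regular `nn.Module`, it composes with built-in layers inside `nn.Sequential` just like any other module.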
An intriguing textbook that is extremely well written and practical. The authors walk you through the theory of neural networks from its inception all the way to current state-of-the-art models like GPT-4. The math theory is built into the text, but what is most interesting is the authors' dedication to application through example problems in PyTorch throughout the entire book. Not only do you come away understanding the history and the math, you also get to work through thought-provoking problems. I highly recommend it. I had prior knowledge of deep learning from courses and hands-on problems before reading this textbook, but the authors' command of deep learning is something else. It is evident that the authors lived through and kept up with the machine learning boom, understanding the mathematical and architectural advances that happened over the years.