100 Days of Deep Learning

Day 1 - The History of Neural Networks

Day 2 - Gradient Descent

Day 3 - Backpropagation

Day 4 - Regression and Classification Using MLPs

Day 5 - Using the Keras Sequential API

Day 6 - Using the Keras Functional API

Day 7 - Building Dynamic Models with the Keras Subclassing API

Day 8 - Saving and Restoring Models

Day 9 - Using Callbacks in Keras

Day 10 - Using TensorBoard for Visualization

Day 11 - Fine-Tuning Neural Network Hyperparameters

Day 12 - Vanishing and Exploding Gradients

Day 13 - Glorot and He Initialization

Day 14 - Nonsaturating Activation Functions

Day 15 - Batch Normalization

Day 16 - Gradient Clipping

Day 17 - Reusing Pretrained Layers

Day 18 - Momentum Optimization

Day 19 - Nesterov Accelerated Gradient

Day 20 - Adaptive Learning Rates

Day 21 - Learning Rate Scheduling

Day 22 - Preventing Overfitting Using Regularization

Day 23 - Custom Losses in TensorFlow

Day 24 - Custom Activation Functions, Initializers, Regularizers and Constraints

Day 25 - Custom Metrics

Day 26 - Custom Layers in TensorFlow

Day 27 - Custom Models, Losses and Metrics in TensorFlow

Day 28 - Using Autodiff in TensorFlow

Day 29 - Custom Training Loop in TensorFlow

References