Day 21 - Learning Rate Scheduling

Finding a good learning rate is important: if it is too high, training may diverge; if it is too low, training will be very slow and may end up stuck in a suboptimal solution.

Finding a good learning rate:

Train the model for a few hundred iterations while increasing the learning rate exponentially from a very small value to a very large one, then plot the loss as a function of the learning rate (on a log scale). The loss drops at first and then shoots back up; a good learning rate is a bit lower than the point where the loss starts to climb.
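
A minimal sketch of this idea, assuming tf.keras (TensorFlow 2.x); ExponentialLearningRate is a custom callback written for this example (not a built-in Keras class) that multiplies the learning rate by a fixed factor after each batch and records the loss so it can be plotted afterwards:

from tensorflow import keras

class ExponentialLearningRate(keras.callbacks.Callback):
    def __init__(self, factor):
        super().__init__()
        self.factor = factor   # multiply the learning rate by this after every batch
        self.rates = []        # learning rates seen so far
        self.losses = []       # corresponding training losses

    def on_train_batch_end(self, batch, logs=None):
        lr = keras.backend.get_value(self.model.optimizer.learning_rate)
        self.rates.append(lr)
        self.losses.append(logs["loss"])
        keras.backend.set_value(self.model.optimizer.learning_rate, lr * self.factor)

Start from a very low learning rate (e.g. 1e-5), train for one epoch with this callback, then plot losses against rates to see where the loss starts to climb.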

Using a non-constant learning rate:

Instead of keeping the learning rate fixed, start with a fairly large learning rate and reduce it as training progresses; this typically reaches a good solution faster than a well-tuned constant rate. The strategies used to change the learning rate during training are called learning schedules.

Examples of learning schedules (an exponential schedule is sketched in code after this list):

- Power scheduling: the learning rate drops as a function of the iteration number t, e.g. lr = lr0 / (1 + t/s)**c.
- Exponential scheduling: the learning rate is cut by a constant factor every s steps, e.g. lr = lr0 * 0.1**(t/s).
- Piecewise constant scheduling: use one learning rate for a number of epochs, then a smaller one for the next epochs, and so on.
- Performance scheduling: measure the validation error regularly and reduce the learning rate when the error stops dropping.
- 1cycle scheduling: increase the learning rate linearly during the first half of training, then decrease it back down during the second half.
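
For example, exponential scheduling can be implemented in tf.keras by passing a schedule function to the keras.callbacks.LearningRateScheduler callback. A minimal sketch: the initial rate (0.01) and the decay period (20 epochs) are arbitrary example values, and the model/data in the commented fit() call are assumed to exist:

from tensorflow import keras

def exponential_decay_fn(epoch):
    # Divide the learning rate by 10 every 20 epochs, starting from 0.01.
    return 0.01 * 0.1 ** (epoch / 20)

lr_scheduler = keras.callbacks.LearningRateScheduler(exponential_decay_fn)
# history = model.fit(X_train, y_train, epochs=25, callbacks=[lr_scheduler])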

Another way to use a learning schedule (specific to tf.keras):

Define the learning rate using one of the schedules available in keras.optimizers.schedules and then pass this learning rate to any optimizer.

from tensorflow import keras

# Example values: decay the learning rate by a factor of 10 every 20,000 steps.
learning_rate = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.01, decay_steps=20_000, decay_rate=0.1)
optimizer = keras.optimizers.SGD(learning_rate)
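
One nice property of this approach: when you save the whole model, the optimizer and its learning rate schedule are saved along with it, so training can be resumed with the schedule in the right state.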