Day 24 - Custom Activation Functions, Initializers, Regularizers and Constraints
Custom Activations
import tensorflow as tf

def softplus_activation(z):
    # softplus: log(exp(z) + 1), a smooth approximation of ReLU
    return tf.math.log(tf.exp(z) + 1.0)
Custom Initializers
def glorot_init(shape, dtype=tf.float32):
    # Glorot (Xavier) normal initialization: stddev = sqrt(2 / (fan_in + fan_out))
    stddev = tf.sqrt(2. / (shape[0] + shape[1]))
    return tf.random.normal(shape, stddev=stddev, dtype=dtype)
Custom Regularizers
def l1_regularizer(weights):
    # L1 penalty with a fixed factor of 0.01, returned as a scalar loss
    return tf.reduce_sum(tf.abs(0.01 * weights))
Custom Constraints
def positive_weights_only(weights):
    # clip negative weights to zero, keeping the kernel non-negative
    return tf.nn.relu(weights)
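
As a quick sanity check (assuming TensorFlow 2.x, imported as above), each function can be called directly with the same signature Keras will use:

z = tf.constant([-1.0, 0.0, 1.0])
print(softplus_activation(z))                            # matches tf.math.softplus(z)
print(glorot_init((4, 3)))                               # a 4x3 tensor of small random values
print(l1_regularizer(tf.ones((2, 2))))                   # scalar loss: 0.04
print(positive_weights_only(tf.constant([-1.0, 2.0])))   # [0., 2.]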
The above custom functions can then be passed to a layer when creating it:
keras.layers.Dense(
    ...
    activation=softplus_activation,
    kernel_initializer=glorot_init,
    kernel_regularizer=l1_regularizer,
    kernel_constraint=positive_weights_only,
    ...
)
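
For example, here is a minimal end-to-end sketch (TensorFlow 2.x assumed; the layer sizes and the random dataset are made up for illustration):

import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(
        30,
        activation=softplus_activation,
        kernel_initializer=glorot_init,
        kernel_regularizer=l1_regularizer,
        kernel_constraint=positive_weights_only,
    ),
    keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="sgd")

X = np.random.rand(100, 8).astype(np.float32)
y = np.random.rand(100, 1).astype(np.float32)
model.fit(X, y, epochs=2)

# the constraint keeps the first Dense layer's kernel non-negative
assert tf.reduce_min(model.layers[0].kernel).numpy() >= 0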
- When the layer is initialized, its weights are set to the tensor returned by the initializer function.
- The activation function is applied to the output of the Dense layer before it is passed to the next layer in the network.
- At each training step, the weights are passed to the regularizer function to compute the regularization loss, which is added to the main loss.
- After each training step, the constraint function is called on the layer's weights, and they are replaced by whatever it returns (see the sketch below).
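
To make these four moments concrete, here is a simplified, hand-written version of a single training step (illustrative only, not the actual Keras internals; `kernel`, `x_batch`, and `y_batch` are made-up names):

kernel = tf.Variable(glorot_init((4, 1)))      # initializer runs once, at creation
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

x_batch = tf.random.normal((8, 4))
y_batch = tf.random.normal((8, 1))

with tf.GradientTape() as tape:
    y_pred = softplus_activation(x_batch @ kernel)    # activation on the layer output
    main_loss = tf.reduce_mean(tf.square(y_batch - y_pred))
    loss = main_loss + l1_regularizer(kernel)         # regularization loss added to main loss

grads = tape.gradient(loss, [kernel])
optimizer.apply_gradients(zip(grads, [kernel]))
kernel.assign(positive_weights_only(kernel))          # constraint applied after the step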