Day 24 - Custom Activation Functions, Initializers, Regularizers and Constraints

Custom Activations

def softplus_activation(z):
	# softplus: log(exp(z) + 1); TensorFlow also ships this as tf.math.softplus,
	# which is numerically safer for large z (exp(z) can overflow here)
	return tf.math.log(tf.exp(z) + 1.0)
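As a quick sanity check, this hand-rolled softplus should match TensorFlow's built-in `tf.math.softplus` for ordinary inputs (the sample values below are my own):

```python
import tensorflow as tf

def softplus_activation(z):
    return tf.math.log(tf.exp(z) + 1.0)

z = tf.constant([-2.0, 0.0, 3.0])
custom = softplus_activation(z)
builtin = tf.math.softplus(z)

# softplus(0) = log(2) ≈ 0.6931
print(custom.numpy())
print(builtin.numpy())
```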

Custom Initializers

def glorot_init(shape, dtype=tf.float32):
	# Glorot (Xavier) normal initialization: stddev = sqrt(2 / (fan_in + fan_out))
	stddev = tf.sqrt(2. / (shape[0] + shape[1]))
	return tf.random.normal(shape, stddev=stddev, dtype=dtype)
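Calling the initializer directly shows what Keras will do with it: the returned tensor has the requested shape, and its sample standard deviation should land near sqrt(2 / (fan_in + fan_out)). The shape below is an arbitrary example of mine:

```python
import tensorflow as tf

def glorot_init(shape, dtype=tf.float32):
    stddev = tf.sqrt(2. / (shape[0] + shape[1]))
    return tf.random.normal(shape, stddev=stddev, dtype=dtype)

weights = glorot_init((300, 100))
print(weights.shape)                       # (300, 100)
# sample stddev ≈ sqrt(2 / 400) ≈ 0.0707
print(float(tf.math.reduce_std(weights)))
```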

Custom Regularizers

def l1_regularizer(weights):
	# L1 penalty with factor 0.01; equivalent to keras.regularizers.l1(0.01)
	return tf.reduce_sum(tf.abs(0.01 * weights))
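One caveat: a plain function like this has no way to save its hyperparameter (the 0.01 factor) along with the model. Subclassing `keras.regularizers.Regularizer` and implementing `get_config()` preserves it across save/load; a minimal sketch (the class name and `factor` argument are my own):

```python
import tensorflow as tf
from tensorflow import keras

class MyL1Regularizer(keras.regularizers.Regularizer):
    def __init__(self, factor=0.01):
        self.factor = factor

    def __call__(self, weights):
        # same penalty as the function version, but factor is configurable
        return tf.reduce_sum(tf.abs(self.factor * weights))

    def get_config(self):
        # lets Keras serialize the hyperparameter with the model
        return {"factor": self.factor}

reg = MyL1Regularizer(0.01)
penalty = reg(tf.ones((2, 2)))
print(float(penalty))  # 4 weights × 0.01 = 0.04
```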

Custom Constraints

def positive_weights_only(weights):
	# clip negative weights to zero, like keras.constraints.NonNeg()
	return tf.nn.relu(weights)
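Keras applies the constraint to the kernel after each training step; calling it directly shows the effect (the example matrix is my own):

```python
import tensorflow as tf

def positive_weights_only(weights):
    return tf.nn.relu(weights)

w = tf.constant([[-0.5, 0.2], [0.3, -0.1]])
clipped = positive_weights_only(w)
# negative entries become 0, positive entries pass through:
# [[0.0, 0.2], [0.3, 0.0]]
print(clipped.numpy())
```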

These custom functions can then be passed to the corresponding arguments when creating a layer:

keras.layers.Dense(
	...,  # other arguments (e.g. number of units)
	activation=softplus_activation,
	kernel_initializer=glorot_init,
	kernel_regularizer=l1_regularizer,
	kernel_constraint=positive_weights_only,
)
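Putting it all together, a minimal end-to-end sketch (the layer size, input shape, and random data are placeholders of my own). Note that when reloading a saved model, these functions must be supplied via the `custom_objects` argument of `keras.models.load_model`:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

def softplus_activation(z):
    return tf.math.log(tf.exp(z) + 1.0)

def glorot_init(shape, dtype=tf.float32):
    stddev = tf.sqrt(2. / (shape[0] + shape[1]))
    return tf.random.normal(shape, stddev=stddev, dtype=dtype)

def l1_regularizer(weights):
    return tf.reduce_sum(tf.abs(0.01 * weights))

def positive_weights_only(weights):
    return tf.nn.relu(weights)

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(
        1,
        activation=softplus_activation,
        kernel_initializer=glorot_init,
        kernel_regularizer=l1_regularizer,
        kernel_constraint=positive_weights_only,
    ),
])
model.compile(loss="mse", optimizer="sgd")

X = np.random.rand(32, 8).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(X, y, epochs=1, verbose=0)

# the constraint keeps the kernel non-negative after each update
kernel = model.layers[0].kernel.numpy()
print(kernel.min() >= 0.0)  # True
```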