Day 14 - Nonsaturating Activation Functions


Rectified Linear Unit (ReLU)
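
For reference, ReLU passes positive inputs through unchanged and clips negative inputs to zero:

\text{ReLU}(z) = \max(0, z)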

Dying ReLUs

Using ReLU in Keras

keras.layers.Dense(50, activation="relu", kernel_initializer="he_normal")

Leaky ReLU
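
Leaky ReLU keeps a small, fixed slope α on negative inputs instead of zeroing them out, so "dead" units can recover:

\text{LeakyReLU}_{\alpha}(z) = \max(\alpha z, z)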

Using Leaky ReLU in Keras

...
keras.layers.Dense(10, kernel_initializer="he_normal"),
keras.layers.LeakyReLU(alpha=0.2),
...
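
For context, a minimal sketch of where this fragment might sit in a full model, assuming tf.keras; the 28x28 input shape and the layer sizes are placeholders, not part of the original snippet:

from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),
    keras.layers.Dense(300, kernel_initializer="he_normal"),
    keras.layers.LeakyReLU(alpha=0.2),   # activation applied as its own layer
    keras.layers.Dense(100, kernel_initializer="he_normal"),
    keras.layers.LeakyReLU(alpha=0.2),
    keras.layers.Dense(10, activation="softmax"),
])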

Randomized Leaky ReLU (RReLU)

Using RReLU in Keras
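
Keras does not ship a built-in RReLU layer, so it has to be written as a small custom layer (or taken from an add-on package). A minimal sketch, assuming tf.keras; the class name and the default bounds (the commonly cited 1/8 to 1/3 range) are my own choices:

import tensorflow as tf
from tensorflow import keras

class RReLU(keras.layers.Layer):
    def __init__(self, lower=1/8, upper=1/3, **kwargs):
        super().__init__(**kwargs)
        self.lower = lower
        self.upper = upper

    def call(self, inputs, training=None):
        if training:
            # training: sample a random negative slope per element
            alpha = tf.random.uniform(tf.shape(inputs), self.lower, self.upper)
        else:
            # inference: use the expected (average) slope
            alpha = (self.lower + self.upper) / 2
        return tf.where(inputs >= 0.0, inputs, alpha * inputs)

It would slot into a model the same way as LeakyReLU above: right after a Dense layer that has no activation of its own.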


Parametric Leaky ReLU (PReLU)

Using PReLU in Keras

...
keras.layers.Dense(10, kernel_initializer="he_normal"),
keras.layers.PReLU(),
...
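
By default PReLU learns one α per input unit. For convolutional layers, keras.layers.PReLU also takes a shared_axes argument to share a single α across the spatial dimensions, for example:

keras.layers.PReLU(shared_axes=[1, 2])   # one learned alpha per channel of a conv feature map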

Exponential Linear Unit (ELU)

\text{ELU}_{\alpha}(z) = \begin{cases} \alpha\,(e^{z} - 1) & \text{if } z < 0,\\ z & \text{if } z \geq 0. \end{cases}

Using ELU in Keras

...
keras.layers.Dense(10, kernel_initializer="he_normal"),
keras.layers.ELU(),
...
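
ELU is also registered as a built-in activation string, so when the default α = 1 is fine it can be passed directly to the layer:

keras.layers.Dense(10, activation="elu", kernel_initializer="he_normal")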

Scaled Exponential Linear Unit (SELU)
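
SELU is ELU scaled by a fixed constant λ (with a specific α), chosen so that activations tend to keep mean 0 and variance 1 from layer to layer:

\text{SELU}(z) = \lambda \, \text{ELU}_{\alpha}(z), \quad \text{with } \lambda \approx 1.0507 \text{ and } \alpha \approx 1.6733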

Using SELU in Keras

keras.layers.Dense(10, activation="selu", kernel_initializer="lecun_normal")
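
Self-normalization also relies on standardized inputs, LeCun normal initialization, and a plain sequential stack of Dense layers. A minimal sketch of such a stack, assuming tf.keras and placeholder layer sizes:

from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),   # inputs standardized beforehand (mean 0, std 1)
    keras.layers.Dense(300, activation="selu", kernel_initializer="lecun_normal"),
    keras.layers.Dense(100, activation="selu", kernel_initializer="lecun_normal"),
    keras.layers.Dense(10, activation="softmax"),
])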

Rules of thumb