Loss Functions

Activation Functions

Def Rectified Linear Unit (ReLU): a non-linear activation function defined as

ReLU(x) = max(0, x)

It passes positive inputs through unchanged and maps negative inputs to zero.
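As a quick illustration (not part of the original notes), here is a minimal NumPy sketch of ReLU applied elementwise; the function name relu and the sample input are assumptions for demonstration only.

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): negative entries become 0, positive entries pass through.
    return np.maximum(0.0, x)

# Example: mixed negative and positive inputs.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```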

Def Softmax: a non-linear activation function that maps a vector of real values z to a probability distribution, defined componentwise as

softmax(z)_i = exp(z_i) / sum_j exp(z_j)

Each output lies in (0, 1) and the outputs sum to 1.
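A minimal sketch of softmax in NumPy, assuming a 1-D input vector (an assumption added here, not stated in the notes); subtracting the maximum before exponentiating is a standard numerical-stability trick and is an added detail, not part of the definition above.

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability; this does not change the result.
    shifted = z - np.max(z)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Example: the outputs are positive and sum to 1.
z = np.array([1.0, 2.0, 3.0])
print(softmax(z))        # [0.09003057 0.24472847 0.66524096]
print(softmax(z).sum())  # 1.0
```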