Leaky ReLU Activation Function in Neural Networks
An activation function in a neural network is applied to each node in a layer, producing an output based on the node's input. Functions such as the sigmoid or step function are commonly used as activation functions. One such function is the Rectified Linear Unit (ReLU), which returns its input unchanged when the input is positive and zero otherwise: f(x) = max(0, x). Leaky ReLU is a variant that, instead of outputting zero for negative inputs, lets a small fraction of the input through: f(x) = x for x > 0 and f(x) = αx otherwise, where α is a small constant such as 0.01. This keeps a non-zero gradient for negative inputs and helps avoid the "dying ReLU" problem, where a unit stops updating because its gradient is always zero.
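As a concrete illustration, here is a minimal NumPy sketch of a Leaky ReLU; the function name, the α default of 0.01, and the sample inputs are illustrative assumptions, not taken from the original article:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: identity for positive inputs, slope `alpha` for negative ones.

    `alpha=0.01` is a common default, chosen here for illustration.
    """
    return np.where(x > 0, x, alpha * x)

# Example: negative inputs are scaled by alpha instead of being zeroed out.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.     3.   ]
```

Compared with plain ReLU, the only change is that negative values are multiplied by α rather than clamped to zero, so the function still has a small but non-zero gradient everywhere except at x = 0.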