4 Activation Functions in Python to know!


Hello, readers! In this article, we will be focusing on activation functions in Python, in detail.

So, let us get started!! 🙂


What is an Activation function?

In the world of neural networks and deep learning with convolutional models, Python plays a significant role when it comes to modeling and analyzing data.

Activation functions are the mathematical functions that control the output of a neuron in a neural network model. That is, they help us analyze and decide whether a neuron should be activated (fired) or not, based on the input it receives.

Some of the prominent activation functions are:

  1. ReLU function
  2. Leaky ReLU function
  3. Sigmoid function
  4. Softmax function
  5. Linear function, etc.

Having understood what an activation function is, let us now have a look at the above activation functions in the upcoming sections.


1. ReLU function

The ReLU (Rectified Linear Unit) function is a type of activation function that is widely used in convolutional neural networks. It passes only the positive part of its input on to the rest of the model.

The ReLU function states that when the input is negative, return zero. Else, for a non-negative input, it returns the input value itself.

Example:

Here, we have implemented a user-defined function that applies the ReLU condition using the max() function in Python.

def ReLu(ar):
    # ReLU: return the input if it is positive, otherwise return 0.0
    return max(0.0, ar)

ar = 1.0
print(ReLu(ar))   # positive input is returned unchanged

ar1 = -1.0
print(ReLu(ar1))  # negative input is clipped to 0.0

Output:

1.0
0.0

2. Leaky ReLU function

The gradient, i.e. the derivative of the ReLU function, is zero for every negative input. This basically means that the weights of neurons that keep receiving negative inputs are never updated by the learning algorithm.

To overcome this gradient issue of the ReLU function, the Leaky ReLU function was introduced.

The Leaky ReLU function multiplies negative inputs by a small constant slope instead of setting them to zero. By this, the gradient for negative inputs turns out to be a small non-zero value, so the corresponding weights can still be updated.

Example:

def leaky_relu(x):
    # Leaky ReLU: pass positive inputs through unchanged,
    # scale negative inputs by a small constant slope (here 0.001)
    if x > 0:
        return x
    else:
        return 0.001 * x

x = -1.0
print(leaky_relu(x))  # the negative input is scaled down, not clipped to zero

Output:

-0.001
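
To make the gradient argument above concrete, here is a minimal sketch comparing the derivatives of ReLU and Leaky ReLU for a negative input. The helper names relu_grad and leaky_relu_grad are our own, and the 0.001 slope is taken from the example above.

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 for negative inputs
    return 1.0 if x > 0 else 0.0

def leaky_relu_grad(x, slope=0.001):
    # Derivative of Leaky ReLU: 1 for positive inputs, the small slope otherwise
    return 1.0 if x > 0 else slope

x = -1.0
print(relu_grad(x))        # 0.0   -> no gradient flows through this neuron
print(leaky_relu_grad(x))  # 0.001 -> a small gradient still flows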

3. Sigmoid function

The Sigmoid activation function is based on the sigmoid mathematical formula below:

sigmoid(x) = 1 / (1 + e^(-x))

Since the denominator 1 + e^(-x) is always greater than one, the output of this activation function always lies between 0 and 1.

Example:

import numpy as np

def sigmoid(num):
    # Sigmoid: squashes any real-valued input into the range (0, 1)
    return 1 / (1 + np.exp(-num))

num = -1.0
print(sigmoid(num))

Output:

0.2689414213699951

4. Softmax function

The softmax activation function can be described as a mathematical function that accepts a vector of numeric values as input and then normalizes the data.

That is, it rescales the data values into a probability distribution wherein the probability of every data value is proportional to the exponential of that value.

As a result, all the data values will be in the range of 0 – 1. Also, the summation of all the data values will be equal to 1, as they are interpreted as probabilities.
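
Example:

Here is a minimal softmax sketch using NumPy. The helper name softmax and the sample input vector are our own choices for illustration.

import numpy as np

def softmax(vec):
    # Subtract the maximum value for numerical stability before exponentiating
    exp_vec = np.exp(vec - np.max(vec))
    # Divide by the sum so that the outputs form a probability distribution
    return exp_vec / exp_vec.sum()

vec = np.array([1.0, 2.0, 3.0])
probs = softmax(vec)
print(probs)        # [0.09003057 0.24472847 0.66524096]
print(probs.sum())  # 1.0 (up to floating point rounding)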


Conclusion

By this, we have come to the end of this topic. Feel free to comment below, in case you come across any questions.

For more such posts related to Python programming, stay tuned with us.

Till then, Happy Learning!! 🙂