The Sigmoid Activation Function in Python

If you’re learning about neural networks, chances are high that you have come across the term activation function. In a neural network, an activation function decides whether a particular neuron will be activated or not. It takes the weighted sum of a node’s inputs, applies some mathematical transformation, and outputs a value that determines how strongly the neuron activates.

There are many activation functions, such as ReLU, tanh, softmax, and sigmoid.

In this tutorial, we will be learning about the sigmoid activation function. So let’s begin!


What is the sigmoid function – The math behind it

Sigmoid is a non-linear activation function. It is mostly used in models where we need to predict a probability. Since probabilities lie between 0 and 1, the output of sigmoid also lies in the range (0, 1); the function approaches, but never actually reaches, 0 and 1.
Let’s have a look at the equation of the sigmoid function.

sigmoid(x) = 1 / (1 + e^(-x))

Sigmoid is usually denoted using the Greek symbol sigma. So, we can also write

σ(x) = 1 / (1 + e^(-x))

In the above equation, 'e' is Euler's number, approximately equal to 2.718. Similarly,

σ(-x) = 1 / (1 + e^x)

In fact, we can derive a relation between the above two equations as

σ(x) = 1 − σ(-x)

We can also prove this relation as shown below:

LHS:

σ(x) = 1 / (1 + e^(-x))

Multiplying the numerator and the denominator by e^x, it can also be written as

σ(x) = e^x / (e^x + 1)

RHS:

1 − σ(-x) = 1 − 1 / (1 + e^x)

1 − σ(-x) = e^x / (1 + e^x)

Therefore, LHS = RHS, and we have

σ(x) = 1 − σ(-x)

Hence, we have proved the relation.
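The identity is also easy to sanity-check numerically. Here is a minimal sketch in plain Python (no external libraries) that evaluates both sides at a few sample points:

```python
# Numerical sanity check of the identity sigmoid(x) = 1 - sigmoid(-x)
from math import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

for x in [-3.0, -0.5, 0.0, 1.2, 4.0]:
    lhs = sigmoid(x)
    rhs = 1 - sigmoid(-x)
    # both sides should agree up to floating-point error
    assert abs(lhs - rhs) < 1e-12
```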

Another property of the sigmoid activation function is that it is differentiable. Let us see how we can differentiate it.

Differentiating σ(x) = 1 / (1 + e^(-x)) with respect to x, we get

σ'(x) = e^(-x) / (1 + e^(-x))^2

which we can split into a product of two factors:

σ'(x) = [1 / (1 + e^(-x))] × [e^(-x) / (1 + e^(-x))]

The first factor is σ(x) itself, and the second factor, e^(-x) / (1 + e^(-x)), equals 1 − σ(x), which by the relation proved above is σ(-x). So we can write

σ'(x) = σ(x) × σ(-x)

Or,

σ'(x) = σ(x) × (1 − σ(x))
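This product form is one reason sigmoid is convenient in backpropagation: the derivative can be computed from the forward-pass output alone. A small sketch checks it against a central finite-difference approximation of the derivative:

```python
# Check sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) numerically
from math import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1 - s)

h = 1e-6  # step size for the finite-difference approximation
for x in [-2.0, 0.0, 0.7, 3.0]:
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
    analytic = sigmoid_grad(x)
    assert abs(numeric - analytic) < 1e-6
```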

Phew! That’s a lot of maths! Now, let us have a look at the graph of the sigmoid function.


Sigmoid graph using Python Matplotlib

#importing the required libraries
from math import exp
from matplotlib import pyplot as plt

#defining the sigmoid function
def sigmoid(x):
    return 1/(1+exp(-x))

#inputs: integers from -5 to 5 (inclusive)
inputs = list(range(-5, 6))

#outputs: sigmoid of each input
outputs = [sigmoid(x) for x in inputs]

#plotting the graph
plt.plot(inputs, outputs)
plt.title("Sigmoid activation function")
plt.grid()
#adding labels to the axes
plt.xlabel("x")
plt.ylabel("sigmoid(x)")
#marking the point (0, 0.5)
plt.scatter([0], [0.5], color="red", zorder=5)
plt.show()

Output:

[Sigmoid plot: an S-shaped curve rising from near 0 to near 1, with the point (0, 0.5) marked in red]

The above plot leads us to a few properties of the sigmoid function. They are:

  • S-shaped: The graph of sigmoid is S-shaped, just like the graph of the tanh activation function.
  • Domain: The domain of sigmoid is (-∞, +∞).
  • Range: The range of sigmoid is (0, 1); the function never actually reaches 0 or 1.
  • Continuous: The sigmoid function is continuous everywhere.
  • Monotonic: The sigmoid function is monotonically increasing.
  • sigmoid(0) = 0.5
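These properties can be verified numerically as well. The following sketch checks the range, the monotonicity, and the value at zero over a grid of points:

```python
# Verify key properties of sigmoid over a grid of points
from math import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

xs = [i / 10 for i in range(-100, 101)]  # -10.0 to 10.0 in steps of 0.1
ys = [sigmoid(x) for x in xs]

assert all(0 < y < 1 for y in ys)              # range is (0, 1)
assert all(a < b for a, b in zip(ys, ys[1:]))  # strictly increasing
assert sigmoid(0) == 0.5                       # value at the origin
```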

Relation between sigmoid and tanh

We have previously discussed the tanh activation function in our tutorial.

The equation of tanh is:

tanh(x) = (e^x − e^(-x)) / (e^x + e^(-x))

And,

σ(2x) = 1 / (1 + e^(-2x))

These two functions are related:

tanh(x) = 2σ(2x) − 1
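This relation, too, can be sanity-checked numerically, using math.tanh from the standard library:

```python
# Numerical check of the relation tanh(x) = 2 * sigmoid(2x) - 1
from math import exp, tanh

def sigmoid(x):
    return 1 / (1 + exp(-x))

for x in [-2.5, -1.0, 0.0, 0.3, 2.0]:
    # both sides should agree up to floating-point error
    assert abs(tanh(x) - (2 * sigmoid(2 * x) - 1)) < 1e-12
```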

Summary

Let’s have a quick recap: The sigmoid activation function is non-linear, monotonic, S-shaped, differentiable, and continuous. That’s all! We have learned about the sigmoid activation function and also its properties.

Hope you found this tutorial helpful. Do check out more such tutorials related to Python here.