The tanh activation function


In deep learning, neural networks consist of neurons, each of which combines its inputs using weights and a bias and then applies an activation function. The weights and biases are adjusted based on the error in the output; this process is called backpropagation. Activation functions make this process possible because they supply the gradients needed to update the weights and biases.

Activation functions introduce non-linearity into neural networks. They convert the linear input signals into non-linear output signals. Some common activation functions are Sigmoid, ReLU, Softmax, and tanh.

In this tutorial, we’ll be learning about the tanh activation function. So let’s get started.


What is tanh?

Activation functions can be either linear or non-linear. tanh is the abbreviation for hyperbolic tangent. It is a non-linear, exponential activation function and is mostly used in multilayer neural networks, specifically for hidden layers.

Let us see the equation of the tanh function.

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
tanh Equation 1

Here, ‘e’ is Euler’s number, which is also the base of the natural logarithm. Its value is approximately 2.718.
On simplifying this equation (multiplying both the numerator and denominator by e^x), we get:

tanh(x) = (e^(2x) - 1) / (e^(2x) + 1)
tanh Equation 2
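As a quick sanity check, the two equation forms can be compared numerically in Python (a small sketch using the math module; the function names are ours, chosen for illustration):

```python
from math import exp, isclose

def tanh_form1(x):
    # tanh Equation 1: (e^x - e^-x) / (e^x + e^-x)
    return (exp(x) - exp(-x)) / (exp(x) + exp(-x))

def tanh_form2(x):
    # tanh Equation 2: (e^2x - 1) / (e^2x + 1)
    return (exp(2 * x) - 1) / (exp(2 * x) + 1)

# the two forms agree for every input we try
for x in [-3.0, -1.0, 0.0, 1.0, 3.0]:
    assert isclose(tanh_form1(x), tanh_form2(x))
```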

The tanh activation function is generally said to perform better than the sigmoid activation function, especially in hidden layers. In fact, the tanh and sigmoid activation functions are closely related and can be derived from each other.


Relation between tanh and sigmoid activation function

The equation for the sigmoid activation function is

sigmoid(x) = 1 / (1 + e^(-x))
Sigmoid Equation 1

Substituting 2x for x, we can similarly write:

sigmoid(2x) = 1 / (1 + e^(-2x))
Sigmoid Equation 2

So, from tanh Equation 1 and Sigmoid Equation 2, we can derive the relation between the two functions:

tanh(x) = 2 * sigmoid(2x) - 1
tanh-Sigmoid Relation Equation
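This relation can also be verified numerically; here is a quick sketch comparing it against Python's built-in math.tanh:

```python
from math import exp, tanh, isclose

def sigmoid(x):
    return 1 / (1 + exp(-x))

# the relation tanh(x) = 2 * sigmoid(2x) - 1,
# checked against Python's built-in math.tanh
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert isclose(2 * sigmoid(2 * x) - 1, tanh(x))
```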

Now, let us try to plot the graph of the tanh function using Python.


Creating a tanh graph using Matplotlib

We will be using the matplotlib library to plot the graph. This is a vast library and we’ve covered it in much detail on our website. Here’s a list of all the matplotlib tutorials on AskPython.

#importing the required libraries
from math import exp
import matplotlib.pyplot as plt

#defining the tanh function using equation 1
def tanh(x):
    return (exp(x) - exp(-x)) / (exp(x) + exp(-x))

#input to the tanh function; range(-5, 6) covers -5 through +5,
#and 'inputs' avoids shadowing the built-in input()
inputs = list(range(-5, 6))

#output of the tanh function
outputs = [tanh(x) for x in inputs]

#plotting the graph for the tanh function
plt.plot(inputs, outputs)
plt.grid()
#adding a title and labels to the axes
plt.title("tanh activation function")
plt.xlabel('x')
plt.ylabel('tanh(x)')
plt.show()

Output:

tanh Plot using first equation

As can be seen above, the graph of tanh is S-shaped. Its output ranges from -1 to +1. Also, observe that the output is zero-centered, which is useful when performing backpropagation.

If, instead of using the direct equation, we use the tanh-sigmoid relation, the code becomes:

#importing the required libraries
from math import exp
import matplotlib.pyplot as plt

#defining the sigmoid function
def sigmoid(x):
    return 1 / (1 + exp(-x))

#defining the tanh function using the relation
def tanh(x):
    return 2 * sigmoid(2 * x) - 1

#input to the tanh function; range(-5, 6) covers -5 through +5,
#and 'inputs' avoids shadowing the built-in input()
inputs = list(range(-5, 6))

#output of the tanh function
outputs = [tanh(x) for x in inputs]

#plotting the graph for the tanh function
plt.plot(inputs, outputs)
plt.grid()
#adding a title and labels to the axes
plt.title("tanh activation function")
plt.xlabel('x')
plt.ylabel('tanh(x)')
plt.show()

Output:

tanh Plot using second equation

The above two plots are exactly the same, verifying that the relation between the two functions is correct.
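In practice, a vectorized implementation is usually preferred over a Python loop. Here is a minimal sketch using NumPy's built-in np.tanh (assuming NumPy is available, which matplotlib already requires):

```python
import numpy as np
import matplotlib.pyplot as plt

# np.tanh applies element-wise, replacing the explicit loops used above
x = np.linspace(-5, 5, 100)
y = np.tanh(x)

plt.plot(x, y)
plt.grid()
plt.title("tanh activation function")
plt.xlabel("x")
plt.ylabel("tanh(x)")
plt.show()
```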

The tanh function has been used in many applications, including natural language processing and speech recognition.


Summary

That’s all! In this tutorial, we learned about the tanh activation function. If you’re interested, you can also learn about the sigmoid activation function.