Hate to break this to you, but mathematics is tiresome! It is a lot more intense than one might think, especially once one enters the territory of differentiation and integration. To make things worse, differentiation can even be done partially.

Good Lord above heavens! But hey, we have got Python to carry out some of the stuff for us. Phew!

This article explains how to use the *gradient( )* function from Python's *numpy* library on N-dimensional arrays.

*Also read: NumPy nanmax – Maximum of an array along an axis ignoring any NaNs*

**Syntax of *numpy.gradient()***

Let us first have a look at the syntax of the *gradient( )* function before getting to know the hows and whats of using it.

```
numpy.gradient( array, varargs, axis=None, edge_order=1)
```

where,

- *array* – a collection of scalar entities of N dimensions
- *varargs* – an optional provision to include variable arguments which dictate the spacing for each dimension of the input array
- *axis* – an optional provision that dictates the direction along which the gradient is calculated; set to None by default, which calculates the gradient for all axes
- *edge_order* – an optional provision that deals with the boundaries at which the gradient is to be calculated; it can be set as '1' or '2', with the former being the default setting
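A quick sketch of the two optional keyword arguments (the input values here are illustrative, not from the article):

```python
import numpy as np

# A small 2-D array to demonstrate the optional arguments.
ar = np.array([[1.0, 2.0, 4.0],
               [2.0, 4.0, 8.0]])

# axis=1 restricts the calculation to the column direction only,
# so a single array is returned instead of one per axis.
g_cols = np.gradient(ar, axis=1)

# edge_order=2 uses a second-order accurate formula at the boundaries.
# For samples of f(x) = x^2 at x = 1..4, it recovers 2x exactly.
g_edge2 = np.gradient(np.array([1.0, 4.0, 9.0, 16.0]), edge_order=2)
```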

Python calculates the gradient by taking differences,

- at the boundaries, the one-sided difference between the two adjacent numbers, divided by the spacing (default '1')
- in the interior, the central difference between the neighbours on either side, divided by twice the spacing (default '2')
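The rules above can be reproduced by hand and checked against *numpy* (a minimal sketch with made-up sample values):

```python
import numpy as np

ar = np.array([2.0, 3.0, 7.0, 13.0])

# Manual calculation with the default spacing h = 1:
first = ar[1] - ar[0]               # one-sided difference at the left edge
interior = (ar[2:] - ar[:-2]) / 2   # central differences, divided by 2h
last = ar[-1] - ar[-2]              # one-sided difference at the right edge
manual = np.concatenate(([first], interior, [last]))

# numpy's gradient( ) follows exactly the same recipe.
auto = np.gradient(ar)
```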

**Calculating Gradient with Uniform Spacing:**

After importing the *numpy* library, let us start by constructing an array and then finding its gradient with a uniform spacing of '3', as given below.

```
import numpy as np
ar = np.array([1.2, 3.4, 5.6])
np.gradient(ar,3)
```
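Worth noting what comes out: every pairwise difference in this array is 2.2, so dividing by the spacing of 3 at the edges (and by 2 × 3 in the interior) yields the same value everywhere, roughly 0.7333:

```python
import numpy as np

ar = np.array([1.2, 3.4, 5.6])
result = np.gradient(ar, 3)
# Edges:    (3.4 - 1.2) / 3 and (5.6 - 3.4) / 3
# Interior: (5.6 - 1.2) / (2 * 3)
# All three work out to 2.2 / 3.
```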

The same can also be done with the default setting of the *gradient( )* function, without explicitly mentioning the required spacing. In this case, the spacing falls back to the default value stated earlier in the syntax section of this article. Also, it is to be noted that one can specify the data type as *float* within the array in case one ought to find the gradient for a bunch of whole numbers.

```
ar = np.array([1, 3, 5, 9], dtype = float)
np.gradient(ar)
```

**Calculating Gradient with Non-Uniform Spacing**

One can also use non-uniform spacing by passing a second array to the *gradient( )* function giving the coordinate (position) at which each element of the input array is sampled. But one ought to bear in mind that this coordinate array must have the same number of elements as the input array.

```
ar = np.array([1.2, 3.4, 5.6], dtype = float)
sp = np.array([7.8, 9.0, 0.1], dtype = float)
np.gradient(ar,sp)
```
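Since the second array is interpreted as sample coordinates, a monotonic example makes the behaviour easier to check by hand (this sketch uses f(x) = x² sampled at unequally spaced points, not the article's values — the interior central-difference formula recovers the derivative 2x exactly for quadratics):

```python
import numpy as np

# Samples of f(x) = x**2 taken at x = 0, 1 and 3 (unequal gaps).
f = np.array([0.0, 1.0, 9.0])
x = np.array([0.0, 1.0, 3.0])

result = np.gradient(f, x)
# Left edge:  (1 - 0) / (1 - 0) = 1
# Interior:   exact derivative at x = 1, i.e. 2
# Right edge: (9 - 1) / (3 - 1) = 4
```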

**Calculating Gradient for N-Dimensional Array:**

When a multi-dimensional array arrives in the picture, the *gradient( )* function returns one result per axis of the input array, irrespective of whether one provides uniform or non-uniform spacing. For a two-dimensional array, that means two results, leaving us wondering *'Now, why would that happen?!'*

This outcome can be attributed to the fact that one result corresponds to the gradient calculated along axis 0 (down the columns, i.e. from row to row) and the other corresponds to the gradient calculated along axis 1 (across the rows, i.e. from column to column) of the input array.

Let us try calculating the gradient of a two-dimensional array with a uniform spacing of '2', as shown below.

```
ar = np.array([[1.2, 3.4, 5.6], [7.8, 9.0, 0.1]])
np.gradient(ar,2)
```
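To make the correspondence explicit, the two returned arrays can be unpacked directly; the values below are worked out by hand from the difference rules covered earlier:

```python
import numpy as np

ar = np.array([[1.2, 3.4, 5.6],
               [7.8, 9.0, 0.1]])

# One result per axis: down the columns first, then along the rows.
g_axis0, g_axis1 = np.gradient(ar, 2)

# Axis 0 has only two rows, so every entry is a one-sided
# difference between the rows divided by the spacing of 2,
# e.g. (7.8 - 1.2) / 2 = 3.3 in the first column.
```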

The first result in the above output is the gradient calculated down the columns (along axis 0) and the one that follows is the gradient calculated along the rows (axis 1).

**Conclusion:**

Now that we have reached the end of this article, hope it has elaborated on how to find the gradient of an N-dimensional array in Python. Here's another article that explains how to return the reciprocal of each element using numpy in Python. There are numerous other enjoyable & equally informative articles on AskPython that might be of great help to those who are looking to level up in Python. Whilst you enjoy those, *hasta luego*!