How do PyTorch’s tensor objects use built-in functionality to calculate gradients?

Differentiation and gradient calculation play a crucial role in updating the weights of a neural network. PyTorch takes care of this automatically through its autograd engine. Here’s an example.

First, define a tensor and set requires_grad to True, so that autograd records every operation performed on it.
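A minimal sketch of that step (the tensor values here are illustrative, not from the original):

```python
import torch

# Create a tensor and tell autograd to track operations on it.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

print(x.requires_grad)  # True
```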

Now define the loss function. In this case, let’s use the sum of the squared values (square loss).

loss = Σᵢ xᵢ²

gradient = ∂loss/∂xᵢ = 2xᵢ

Let’s see how this gradient is calculated using PyTorch’s built-in functionality. The function used is backward(), which computes the gradient of the loss with respect to every tensor that has requires_grad set to True and stores it in the tensor’s .grad attribute.
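Putting the steps together (again with illustrative values), the computed gradient matches the analytical result 2·x:

```python
import torch

# Tensor whose gradient we want; values are arbitrary examples.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Square loss: sum of the squared elements, i.e. 1 + 4 + 9 = 14.
loss = (x ** 2).sum()

# backward() computes d(loss)/dx and stores it in x.grad.
loss.backward()

print(x.grad)  # tensor([2., 4., 6.]), i.e. 2 * x
```

Note that gradients accumulate in .grad across multiple backward() calls, so in a training loop they are typically cleared each step (e.g. with optimizer.zero_grad()).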



Dipanwita Mallick

I am working as a Senior Data Scientist at Hewlett Packard Enterprise. I love exploring new ideas and new places !! :)