Multi-Step Backpropagation | Preparing for Neural Networks
PyTorch Essentials

Course Content

1. PyTorch Basics
2. Preparing for Neural Networks
3. Neural Networks

Multi-Step Backpropagation

Like TensorFlow, PyTorch also lets you build more complex computational graphs involving multiple intermediate tensors.

import torch

# Create a 2D tensor with gradient tracking
x = torch.tensor([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]], requires_grad=True)

# Define intermediate layers
y = 6 * x + 3
z = 10 * y ** 2

# Compute the mean of the final output
output_mean = z.mean()
print(f"Output: {output_mean}")

# Perform backpropagation
output_mean.backward()

# Print the gradient of x
print("Gradient of x:\n", x.grad)

The gradient of output_mean with respect to x is computed using the chain rule. The result shows how much a small change in each element of x affects output_mean.

Disabling Gradient Tracking

In some cases, you may want to disable gradient tracking to save memory and computation. Since requires_grad=False is the default behavior, you can simply create the tensor without specifying this parameter:

Task

You are tasked with building a simple neural network in PyTorch. Your goal is to compute the gradient of the loss with respect to the weight matrix.

  1. Define a random weight matrix (tensor) W of shape 1x3 initialized with values from a uniform distribution over [0, 1], with gradient tracking enabled.
  2. Create an input matrix (tensor) X based on this list: [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]].
  3. Perform matrix multiplication to calculate Y.
  4. Compute the mean squared error (MSE): loss = mean((Y - Y_target)^2).
  5. Calculate the gradient of the loss with respect to W using backpropagation.
  6. Print the computed gradient of W.

Solution
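A sketch of one possible solution. The exercise does not list the target values, so Y_target below is a hypothetical placeholder (zeros); only its shape matters for the gradient computation to run:

```python
import torch

# Step 1: 1x3 weight matrix, uniform over [0, 1), with gradient tracking
W = torch.rand(1, 3, requires_grad=True)

# Step 2: 3x2 input matrix
X = torch.tensor([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

# Step 3: matrix multiplication, (1x3) @ (3x2) -> (1x2)
Y = W @ X

# Step 4: MSE against a hypothetical target (not specified in the task)
Y_target = torch.zeros_like(Y)
loss = ((Y - Y_target) ** 2).mean()

# Step 5: backpropagation
loss.backward()

# Step 6: print the computed gradient of W
print(W.grad)
```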

Section 2. Chapter 2