PyTorch · ~20 mins

requires_grad flag in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Requires Grad Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
What is the value of the requires_grad flag after these tensor operations?

Consider the following PyTorch code snippet. What will be the value of z.requires_grad after the operations?

PyTorch
import torch
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2
y = y.detach()
z = y + 1
A. None
B. True
C. Raises an error
D. False
💡 Hint

Think about what detach() does to the computation graph and the requires_grad flag.
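Once you have committed to an answer, you can check it in a REPL. This sketch just re-runs the problem's snippet and prints the flag:

```python
import torch

# Reproduce the snippet from the problem to inspect the flag yourself.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2          # y tracks gradients because x does
y = y.detach()     # detach() returns a tensor cut off from the computation graph
z = y + 1          # operations on a detached tensor are not tracked

print(z.requires_grad)
```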

Model Choice
intermediate
Which tensor requires gradients for training a model?

You want to train a simple linear model in PyTorch. Which tensor should have requires_grad=True to update its values during training?

A. Input data tensor
B. Model parameters tensor
C. Target labels tensor
D. Random noise tensor
💡 Hint

Which tensors need gradients to update weights during backpropagation?
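A minimal sketch of the training setup the question describes (the tensor names and sizes here are illustrative, not part of the problem). Note which tensors end up with gradients after backward:

```python
import torch

# Illustrative linear model: only the parameters require gradients.
inputs = torch.randn(8, 3)                       # input data: no gradients needed
targets = torch.randn(8, 1)                      # target labels: no gradients needed
weight = torch.randn(3, 1, requires_grad=True)   # parameter: updated during training
bias = torch.zeros(1, requires_grad=True)        # parameter: updated during training

pred = inputs @ weight + bias
loss = ((pred - targets) ** 2).mean()
loss.backward()    # gradients are populated only on the parameters

print(weight.grad is not None)   # parameters received gradients
print(inputs.grad)               # input data did not
```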

Hyperparameter
advanced
How does setting requires_grad=False affect training speed?

What is the effect of setting requires_grad=False on some model parameters during training?

A. Causes training to fail with an error
B. Slows down training because gradients are still computed
C. Speeds up training by skipping gradient computation for those parameters
D. Has no effect on training speed
💡 Hint

Think about what happens when gradients are not computed for some tensors.
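You can observe the effect directly by freezing part of a small network and checking which parameters receive gradients (the layer sizes below are arbitrary, chosen only for the sketch):

```python
import torch
from torch import nn

# Freeze the first layer of a small net; autograd skips its parameters entirely.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
for p in net[0].parameters():
    p.requires_grad = False    # frozen: no gradients computed or stored

out = net(torch.randn(2, 4)).sum()
out.backward()

print(net[0].weight.grad)             # None: gradient computation was skipped
print(net[2].weight.grad is not None) # the unfrozen layer still gets gradients
```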

🔧 Debug
advanced
Why does this code raise an error related to requires_grad?

Examine the code below. Why does the backward call raise an error?

PyTorch
import torch
x = torch.tensor([1.0, 2.0, 3.0])
y = x * 2
z = y.sum()
z.backward()
A. Because x does not have requires_grad=True
B. Because y is not a tensor
C. Because sum() cannot be used before backward()
D. Because backward() requires an argument
💡 Hint

Check if the tensor you want gradients for has requires_grad=True.
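To confirm your diagnosis, you can run the broken snippet, then apply the one-line fix and verify that gradients flow:

```python
import torch

# The failing snippet: no tensor in the graph requires gradients,
# so backward() has nothing to differentiate and raises RuntimeError.
x = torch.tensor([1.0, 2.0, 3.0])
try:
    (x * 2).sum().backward()
except RuntimeError as e:
    print("RuntimeError:", e)

# Fix: mark x as a leaf that requires gradients.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
z = (x * 2).sum()
z.backward()
print(x.grad)    # d(sum(2x))/dx = 2 for every element
```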

🧠 Conceptual
expert
What happens to the requires_grad flag after in-place operations?

In PyTorch, if you perform an in-place operation on a tensor with requires_grad=True, what is the effect on the requires_grad flag and the computation graph?

A. The requires_grad flag remains True, but the computation graph may be corrupted, causing errors during backward
B. The requires_grad flag automatically switches to False
C. The tensor becomes detached from the graph but keeps requires_grad=True
D. In-place operations are not allowed on tensors with requires_grad=True and raise an immediate error
💡 Hint

Consider how in-place changes affect gradient tracking and graph integrity.
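One concrete way to see this failure mode (a sketch, using exp() because its backward pass saves its own output): the flag survives the in-place write, but backward detects that a saved value was modified.

```python
import torch

# exp() saves its output for the backward pass, so modifying that
# output in place corrupts the graph without touching requires_grad.
x = torch.tensor([1.0, 2.0], requires_grad=True)
y = torch.exp(x)     # backward of exp needs y's value
y.add_(1.0)          # in-place op on a non-leaf tensor

print(y.requires_grad)   # still True
try:
    y.sum().backward()   # autograd's version counter catches the change
except RuntimeError as e:
    print("RuntimeError:", e)
```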