What is the purpose of using the torch.no_grad() context manager or decorator in PyTorch?
- To automatically compute the gradients of a scalar loss with respect to all trainable parameters.
- To build a static computational graph before executing the forward pass.
- To zero out the gradients of the optimizer before a new backward pass is performed.
- To enable hardware acceleration by moving tensors to the GPU or TPU.
- ✅ To disable gradient tracking and reduce computational and memory overhead during inference or evaluation. (correct answer)
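A minimal sketch of the behavior the correct answer describes. The small Sequential model and the predict helper below are hypothetical, chosen only to illustrate how torch.no_grad() suppresses graph construction:

```python
import torch
import torch.nn as nn

# Hypothetical toy model, used only for illustration.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()  # eval() switches dropout/batch-norm behavior; it does NOT stop gradient tracking

x = torch.randn(1, 4)

# Default behavior: autograd records the forward pass, so the output
# carries a grad_fn and keeps intermediate activations alive.
y = model(x)
print(y.requires_grad)  # True

# Inside no_grad(): no computational graph is built, which cuts
# memory use and speeds up pure inference/evaluation.
with torch.no_grad():
    y = model(x)
print(y.requires_grad)  # False

# The same object also works as a decorator:
@torch.no_grad()
def predict(inputs):
    return model(inputs)

print(predict(x).requires_grad)  # False
```

Note that model.eval() and torch.no_grad() are complementary: the former changes layer behavior (dropout, batch norm), while the latter disables autograd tracking; evaluation loops typically use both.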