Pytorch RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Cause: with `loss = loss1 + loss2`, if neither `loss1` nor `loss2` is connected to a computation graph (i.e. neither requires grad), the resulting `loss` tensor has `requires_grad=False` and no `grad_fn`, so calling `loss.backward()` raises the error in the title.

Fix: set the `requires_grad` attribute of the `loss` tensor to True in place:

loss.requires_grad_(True)
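A minimal sketch reproducing the error and applying the fix above. The values of `loss1` and `loss2` are made up for illustration; note that marking `loss` itself as requiring grad silences the error, but gradients still cannot flow back to `loss1` and `loss2`, since they were never part of a graph.

```python
import torch

# Two loss tensors built with no computation graph behind them:
# neither requires grad, so their sum has requires_grad=False and no grad_fn.
loss1 = torch.tensor(0.5)
loss2 = torch.tensor(1.5)
loss = loss1 + loss2

try:
    loss.backward()
except RuntimeError as e:
    # "element 0 of tensors does not require grad and does not have a grad_fn"
    print(e)

# The fix from the post: mark the loss tensor as requiring grad in place.
loss.requires_grad_(True)
loss.backward()  # no longer raises; loss.grad is now 1.0
```

In real training code the error usually means the model outputs were detached from the graph somewhere upstream (e.g. inside `torch.no_grad()` or via `.detach()`/`.item()`), which is worth checking before applying this fix.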
