There are two ways to directly zero a model's parameter gradients:
model.zero_grad()
optimizer.zero_grad() # the two are equivalent when the optimizer holds all of the model's parameters, i.e. optimizer = optim.Optimizer(model.parameters())
To zero the gradient of a single Variable, call .zero_() on its gradient in place:
x.grad.data.zero_() # for a Variable x; on modern tensors, x.grad.zero_()
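A minimal sketch of zeroing one tensor's gradient in place; the tensor `x` here is a hypothetical stand-in for "some Variable" (in current PyTorch, Variable has been merged into Tensor):

```python
import torch

# Hypothetical example tensor; any leaf tensor with requires_grad works.
x = torch.ones(3, requires_grad=True)
loss = (x * 2).sum()
loss.backward()        # x.grad now holds tensor([2., 2., 2.])

x.grad.zero_()         # zero this single tensor's gradient in place
print(x.grad)          # tensor([0., 0., 0.])
```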
# Zero the gradients before running the backward pass.
model.zero_grad()
# Before the backward pass, use the optimizer object to zero all of the
# gradients for the variables it will update (which are the learnable weights
# of the model)
optimizer.zero_grad()
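To see the two calls side by side, here is a minimal sketch; the Linear layer, tensor sizes, and learning rate are illustrative, not from the source:

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(4, 2)
optimizer = optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(8, 4)).sum()
loss.backward()            # populates .grad on every learnable weight

# Equivalent here, because the optimizer was built from model.parameters():
optimizer.zero_grad()      # zeros (or frees) the grads of the params it updates
# model.zero_grad()        # would clear the same set of parameters

# Depending on the PyTorch version, zero_grad() may set .grad to None
# (set_to_none=True is the modern default) instead of filling it with zeros.
assert all(p.grad is None or bool((p.grad == 0).all())
           for p in model.parameters())
```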