beginner/blitz/nn: Fix misleading typo on which term to be differentiated against (pytorch#726)

Co-authored-by: holly1238 <[email protected]>
rht and holly1238 authored Apr 25, 2021
1 parent 424f027 commit ff0cfa1
Showing 1 changed file with 3 additions and 2 deletions.
5 changes: 3 additions & 2 deletions beginner_source/blitz/neural_networks_tutorial.py
@@ -176,8 +176,9 @@ def num_flat_features(self, x):
 # -> loss
 #
 # So, when we call ``loss.backward()``, the whole graph is differentiated
-# w.r.t. the loss, and all Tensors in the graph that have ``requires_grad=True``
-# will have their ``.grad`` Tensor accumulated with the gradient.
+# w.r.t. the neural net parameters, and all Tensors in the graph that have
+# ``requires_grad=True`` will have their ``.grad`` Tensor accumulated with the
+# gradient.
 #
 # For illustration, let us follow a few steps backward:

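The behavior the corrected comment describes can be seen directly: calling ``loss.backward()`` computes gradients with respect to every tensor that has ``requires_grad=True``, and those gradients are *accumulated* into ``.grad`` rather than overwritten. A minimal sketch (a toy linear expression standing in for the tutorial's network, not its actual code):

```python
import torch

# Toy "parameters" with gradient tracking enabled.
w = torch.tensor([2.0, 3.0], requires_grad=True)
# Toy input; gradients are not tracked for it.
x = torch.tensor([1.0, 1.0])

loss = (w * x).sum()   # loss = w[0]*x[0] + w[1]*x[1]
loss.backward()        # d(loss)/dw = x, stored in w.grad
print(w.grad)          # tensor([1., 1.])

# A second backward pass *adds* to w.grad instead of replacing it,
# which is why training loops call optimizer.zero_grad() first.
loss2 = (w * x).sum()
loss2.backward()
print(w.grad)          # tensor([2., 2.])
```

This accumulation is also why the tutorial's later training step zeroes the gradient buffers before each backward pass.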
