Fix two usage errors: its vs it's (pytorch#1484)
Co-authored-by: holly1238 <[email protected]>
jamesonwilliams and holly1238 authored Apr 25, 2021
1 parent a06b381 commit 669107c
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion beginner_source/basics/autogradqs_tutorial.py
@@ -58,7 +58,7 @@
 # A function that we apply to tensors to construct computational graph is
 # in fact an object of class ``Function``. This object knows how to
 # compute the function in the *forward* direction, and also how to compute
-# it's derivative during the *backward propagation* step. A reference to
+# its derivative during the *backward propagation* step. A reference to
 # the backward propagation function is stored in ``grad_fn`` property of a
 # tensor. You can find more information of ``Function`` `in the
 # documentation <https://pytorch.org/docs/stable/autograd.html#function>`__.
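
The ``grad_fn`` behavior this hunk describes is easy to verify interactively; below is a minimal sketch, independent of the tutorial files, with an arbitrary tensor shape chosen for illustration:

import torch

# A leaf tensor created by the user has no grad_fn; a tensor produced by
# an operation on it records the backward function for that operation.
x = torch.ones(3, requires_grad=True)
y = x * 2

print(x.grad_fn)  # None: x was created by the user, not by an operation
print(y.grad_fn)  # e.g. <MulBackward0 object at 0x...>: computes the derivative of *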
4 changes: 2 additions & 2 deletions beginner_source/basics/buildmodel_tutorial.py
@@ -67,7 +67,7 @@ def forward(self, x):
 
 ##############################################
 # We create an instance of ``NeuralNetwork``, and move it to the ``device``, and print
-# it's structure.
+# its structure.
 
 model = NeuralNetwork().to(device)
 print(model)
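
The ``NeuralNetwork`` class itself sits outside this hunk, so the two lines above are not runnable on their own; here is a minimal stand-in sketch (the layer sizes are assumptions, not quoted from the tutorial) that makes ``print(model)`` work:

import torch
from torch import nn

# Minimal stand-in for the tutorial's NeuralNetwork; the real class is
# defined earlier in buildmodel_tutorial.py and may use different layers.
class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.linear_relu_stack = nn.Sequential(
            nn.Linear(28 * 28, 512),
            nn.ReLU(),
            nn.Linear(512, 10),
        )

    def forward(self, x):
        x = self.flatten(x)
        return self.linear_relu_stack(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralNetwork().to(device)
print(model)  # prints the nested module hierarchy

Printing a module lists its registered submodules, which is what the corrected comment means by "its structure".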
@@ -119,7 +119,7 @@ def forward(self, x):
 # nn.Linear
 # ^^^^^^^^^^^^^^^^^^^^^^
 # The `linear layer <https://pytorch.org/docs/stable/generated/torch.nn.Linear.html>`_
-# is a module that applies a linear transformation on the input using it's stored weights and biases.
+# is a module that applies a linear transformation on the input using its stored weights and biases.
 #
 layer1 = nn.Linear(in_features=28*28, out_features=20)
 hidden1 = layer1(flat_image)
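
``flat_image`` is defined earlier in the tutorial and does not appear in this diff; here is a self-contained sketch of the same ``layer1`` call, with a stand-in batch of three flattened 28x28 images:

import torch
from torch import nn

# Stand-in input: the tutorial builds flat_image by flattening real images.
flat_image = torch.rand(3, 28 * 28)

layer1 = nn.Linear(in_features=28 * 28, out_features=20)
hidden1 = layer1(flat_image)

print(hidden1.size())       # torch.Size([3, 20])
print(layer1.weight.shape)  # torch.Size([20, 784]): the stored weights
print(layer1.bias.shape)    # torch.Size([20]): the stored biases

The stored ``weight`` and ``bias`` are exactly the parameters the corrected sentence refers to: the layer computes x @ weight.T + bias.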
