Commit 0f312df

Authored by harshaljanjani, svekars, and sekyondaMeta
docs: Fix incorrect usage description of ctx.save_for_backward (#3377)
docs: Fix incorrect usage description of ctx.save_for_backward

Co-authored-by: Svetlana Karslioglu <[email protected]>
Co-authored-by: sekyondaMeta <[email protected]>
1 parent 7584f2f commit 0f312df

File tree

1 file changed (+5, -2 lines)


beginner_source/examples_autograd/polynomial_custom_function.py

Lines changed: 5 additions & 2 deletions
@@ -33,8 +33,11 @@ def forward(ctx, input):
         """
         In the forward pass we receive a Tensor containing the input and return
         a Tensor containing the output. ctx is a context object that can be used
-        to stash information for backward computation. You can cache arbitrary
-        objects for use in the backward pass using the ctx.save_for_backward method.
+        to stash information for backward computation. You can cache tensors for
+        use in the backward pass using the ``ctx.save_for_backward`` method. Other
+        objects can be stored directly as attributes on the ctx object, such as
+        ``ctx.my_object = my_object``. Check out `Extending torch.autograd <https://docs.pytorch.org/docs/stable/notes/extending.html#extending-torch-autograd>`_
+        for further details.
         """
         ctx.save_for_backward(input)
         return 0.5 * (5 * input ** 3 - 3 * input)
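The corrected docstring draws a distinction the old text blurred: tensors needed in the backward pass go through ``ctx.save_for_backward``, while non-tensor objects are stashed as plain attributes on ``ctx``. A minimal runnable sketch of that pattern, extending the tutorial's Legendre polynomial with a hypothetical ``scale`` argument (not part of the actual tutorial code):

```python
import torch

class ScaledLegendre3(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, scale):
        # Tensors needed in backward must be saved via save_for_backward
        ctx.save_for_backward(input)
        # Non-tensor values can be stored directly as ctx attributes
        ctx.scale = scale
        return scale * 0.5 * (5 * input ** 3 - 3 * input)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # d/dx [scale * 0.5 * (5x^3 - 3x)] = scale * 1.5 * (5x^2 - 1)
        grad_input = grad_output * ctx.scale * 1.5 * (5 * input ** 2 - 1)
        # One gradient per forward input; scale is not a Tensor, so None
        return grad_input, None

x = torch.tensor([2.0], requires_grad=True)
y = ScaledLegendre3.apply(x, 1.0)
y.backward()
# With scale=1, y = 0.5 * (5*8 - 6) = 17, and dy/dx = 1.5 * (5*4 - 1) = 28.5
```

Saving tensors as plain attributes instead of through ``save_for_backward`` would skip autograd's correctness checks (e.g. detecting in-place modification), which is the distinction the doc fix is making.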
