
Commit 951f3a8

Author: Svetlana Karslioglu
Update edit-ongithub-note (pytorch#1985)
1 parent a885873 · commit 951f3a8

13 files changed: +17 -11 lines

_static/pencil-16.png (binary, 326 Bytes)

advanced_source/generic_join.rst (+1 -1)

@@ -4,7 +4,7 @@ Distributed Training with Uneven Inputs Using the Join Context Manager
 **Author**\ : `Andrew Gu <https://github.com/andwgu>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/generic_join.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/generic_join.rst>`__.

 .. note:: ``Join`` is introduced in PyTorch 1.10 as a prototype feature. This
    API is subject to change.

advanced_source/rpc_ddp_tutorial.rst (+1 -1)

@@ -3,7 +3,7 @@ Combining Distributed DataParallel with Distributed RPC Framework
 **Authors**: `Pritam Damania <https://github.com/pritamdamania87>`_ and `Yi Wang <https://github.com/SciPioneer>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/rpc_ddp_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/advanced_source/rpc_ddp_tutorial.rst>`__.

 This tutorial uses a simple example to demonstrate how you can combine
 `DistributedDataParallel <https://pytorch.org/docs/stable/nn.html#torch.nn.parallel.DistributedDataParallel>`__ (DDP)

beginner_source/dist_overview.rst (+1 -1)

@@ -3,7 +3,7 @@ PyTorch Distributed Overview
 **Author**: `Shen Li <https://mrshenli.github.io/>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/beginner_source/dist_overview.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/beginner_source/dist_overview.rst>`__.

 This is the overview page for the ``torch.distributed`` package. The goal of
 this page is to categorize documents into different topics and briefly

conf.py (+6 -0)

@@ -46,6 +46,12 @@
     warnings.warn('unable to load "torchvision" package')
 import pytorch_sphinx_theme

+rst_epilog ="""
+.. |edit| image:: /_static/pencil-16.png
+   :width: 16px
+   :height: 16px
+"""
+
 # -- General configuration ------------------------------------------------

 # If your documentation needs a minimal Sphinx version, state it here.
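Note on this change: ``rst_epilog`` is a standard Sphinx configuration value whose string is appended to the end of every reST source file before parsing, so the ``|edit|`` image substitution defined once here becomes available in every tutorial without per-file definitions. A minimal sketch of how a tutorial source can then reference it; the file name in the link below is illustrative, not one of the files in this commit:

.. (illustrative example; any tutorial that uses the |edit| substitution reads the same way)

.. note::
   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/beginner_source/example_tutorial.rst>`__.

The remaining files in this commit apply exactly this pattern to the existing "view the source" notes.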

intermediate_source/FSDP_tutorial.rst (+1 -1)

@@ -4,7 +4,7 @@ Getting Started with Fully Sharded Data Parallel(FSDP)
 **Author**: `Hamid Shojanazeri <https://github.com/HamidShojanazeri>`__, `Yanli Zhao <https://github.com/zhaojuanmao>`__, `Shen Li <https://mrshenli.github.io/>`__

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/FSDP_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/FSDP_tutorial.rst>`__.

 Training AI models at a large scale is a challenging task that requires a lot of compute power and resources.
 It also comes with considerable engineering complexity to handle the training of these very large models.

intermediate_source/ddp_tutorial.rst (+1 -1)

@@ -5,7 +5,7 @@ Getting Started with Distributed Data Parallel
 **Edited by**: `Joe Zhu <https://github.com/gunandrose4u>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/ddp_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/ddp_tutorial.rst>`__.

 Prerequisites:

intermediate_source/dist_pipeline_parallel_tutorial.rst (+1 -1)

@@ -3,7 +3,7 @@ Distributed Pipeline Parallelism Using RPC
 **Author**: `Shen Li <https://mrshenli.github.io/>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_pipeline_parallel_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_pipeline_parallel_tutorial.rst>`__.

 Prerequisites:

intermediate_source/dist_tuto.rst (+1 -1)

@@ -3,7 +3,7 @@ Writing Distributed Applications with PyTorch
 **Author**: `Séb Arnold <https://seba1511.com>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_tuto.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/dist_tuto.rst>`__.

 Prerequisites:

intermediate_source/process_group_cpp_extension_tutorial.rst (+1 -1)

@@ -4,7 +4,7 @@ Customize Process Group Backends Using Cpp Extensions
 **Author**: `Feng Tian <https://github.com/ftian1>`__, `Shen Li <https://mrshenli.github.io/>`__

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/process_group_cpp_extension_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/process_group_cpp_extension_tutorial.rst>`__.

 Prerequisites:

intermediate_source/rpc_async_execution.rst (+1 -1)

@@ -3,7 +3,7 @@ Implementing Batch RPC Processing Using Asynchronous Executions
 **Author**: `Shen Li <https://mrshenli.github.io/>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_async_execution.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_async_execution.rst>`__.

 Prerequisites:

intermediate_source/rpc_param_server_tutorial.rst (+1 -1)

@@ -5,7 +5,7 @@ Implementing a Parameter Server Using Distributed RPC Framework
 **Author**\ : `Rohan Varma <https://github.com/rohan-varma>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_param_server_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_param_server_tutorial.rst>`__.

 Prerequisites:

intermediate_source/rpc_tutorial.rst (+1 -1)

@@ -3,7 +3,7 @@ Getting Started with Distributed RPC Framework
 **Author**: `Shen Li <https://mrshenli.github.io/>`_

 .. note::
-   View the source code for this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_tutorial.rst>`__.
+   |edit| View and edit this tutorial in `github <https://github.com/pytorch/tutorials/blob/master/intermediate_source/rpc_tutorial.rst>`__.

 Prerequisites:
