Commit 7b0272a

Update definition of optimizer in introduction_guide.rst (#10822)
Co-authored-by: rohitgr7 <[email protected]>
1 parent 9906a1a · commit 7b0272a

File tree: 1 file changed (+26 −8 lines)


docs/source/starter/introduction_guide.rst

Lines changed: 26 additions & 8 deletions
@@ -122,7 +122,7 @@ equivalent to a pure PyTorch Module except it has added functionality. However,
     torch.Size([1, 10])
 
 
-Now we add the training_step which has all our training loop logic
+Now we add the ``training_step`` which has all our training loop logic:
 
 .. testcode::
 
@@ -133,11 +133,12 @@ Now we add the training_step which has all our training loop logic
             loss = F.nll_loss(logits, y)
             return loss
 
+
 Optimizer
 ---------
 
-Next, we choose which optimizer to use for training our model.
-In PyTorch, the optimizer is created as follows:
+Next we choose which optimizer to use for training our system.
+In PyTorch, we do it as follows:
 
 .. code-block:: python
 
@@ -146,23 +147,40 @@ In PyTorch, the optimizer is created as follows:
     optimizer = Adam(LitMNIST().parameters(), lr=1e-3)
 
 
-In Lightning, the code above is moved within the :func:`~pytorch_lightning.core.LightningModule.configure_optimizers` method of the LightningModule.
+In Lightning, the same code is re-organized within the :meth:`~pytorch_lightning.core.lightning.LightningModule.configure_optimizers` method.
 
 .. testcode::
 
     class LitMNIST(LightningModule):
         def configure_optimizers(self):
             return Adam(self.parameters(), lr=1e-3)
 
-.. note:: The LightningModule is subclassing :class:`~torch.nn.Module` and therefore, you can access its children parameters directly with ``self.parameters()``.
+.. note:: The ``LightningModule`` is subclassing :class:`~torch.nn.Module` and therefore, you can access its children parameters directly with ``self.parameters()``.
 
 If you have multiple optimizers, you can configure them as follows:
 
 .. testcode::
 
     class LitMNIST(LightningModule):
         def configure_optimizers(self):
-            return Adam(self.generator(), lr=1e-3), Adam(self.discriminator(), lr=1e-3)
+            return Adam(self.generator.parameters(), lr=1e-3), Adam(self.discriminator.parameters(), lr=1e-3)
+
+If you have LR Schedulers you can return them too:
+
+.. testcode::
+
+    from torch.optim.lr_scheduler import CosineAnnealingLR
+
+
+    class LitMNIST(LightningModule):
+        def configure_optimizers(self):
+            opt = Adam(self.parameters(), lr=1e-3)
+            scheduler = CosineAnnealingLR(opt, T_max=10)
+            return [opt], [scheduler]
+
+
+For more available configurations, please checkout the :meth:`~pytorch_lightning.core.lightning.LightningModule.configure_optimizers` method.
+
 
 Data
 ----
@@ -402,8 +420,8 @@ Training
 So far we defined 4 key ingredients in pure PyTorch but organized the code with the LightningModule.
 
 1. Model.
-2. Training data.
-3. Optimizer.
+2. Optimizer.
+3. Training data.
 4. What happens in the training loop.
 
 |

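For context, here is a minimal, self-contained sketch (not part of the commit) of how the pieces touched by this change fit together in one LightningModule: a forward pass, a ``training_step``, and a ``configure_optimizers`` that returns both an Adam optimizer and a CosineAnnealingLR scheduler, as in the added lines above. The two-layer MLP and the shape check are illustrative assumptions rather than the guide's exact model.

import torch
from torch import nn
from torch.nn import functional as F
from torch.optim import Adam
from torch.optim.lr_scheduler import CosineAnnealingLR
from pytorch_lightning import LightningModule


class LitMNIST(LightningModule):
    def __init__(self):
        super().__init__()
        # Illustrative two-layer MLP; the guide's actual architecture may differ.
        self.layer_1 = nn.Linear(28 * 28, 128)
        self.layer_2 = nn.Linear(128, 10)

    def forward(self, x):
        x = x.view(x.size(0), -1)  # flatten (B, 1, 28, 28) -> (B, 784)
        x = F.relu(self.layer_1(x))
        return F.log_softmax(self.layer_2(x), dim=1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self(x)
        return F.nll_loss(logits, y)  # Lightning backpropagates this loss

    def configure_optimizers(self):
        opt = Adam(self.parameters(), lr=1e-3)
        scheduler = CosineAnnealingLR(opt, T_max=10)
        return [opt], [scheduler]  # list of optimizers, list of schedulers


if __name__ == "__main__":
    # Quick shape check mirroring the guide's `torch.Size([1, 10])` output.
    print(LitMNIST()(torch.randn(1, 1, 28, 28)).shape)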