
Commit 5b772de
committed
last commit prior to documentation/pl_example porting
1 parent 28911d4

File tree: 4 files changed (+17, −16 lines)


docs/source/advanced/finetuning_scheduler.rst

Lines changed: 15 additions & 12 deletions
@@ -17,8 +17,9 @@ transfer learning [#]_ [#]_ [#]_ .
 :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts.FinetuningScheduler` orchestrates the gradual unfreezing
 of models via a finetuning schedule that is either implicitly generated (the default) or explicitly provided by the user
 (more computationally efficient). Finetuning phase transitions are driven by
-:class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping` criteria, user-specified epoch transitions or a
-composition of the two (the default mode). A
+:class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSEarlyStopping` criteria (a multi-phase
+extension of :class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping`), user-specified epoch transitions
+or a composition of the two (the default mode). A
 :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts.FinetuningScheduler` training session completes when the
 final phase of the schedule has its stopping criteria met. See
 :ref:`Early Stopping<common/early_stopping:Early stopping>` for more details on that callback's configuration.
@@ -31,14 +32,15 @@ Basic Example
 If no finetuning schedule is user-provided,
 :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts.FinetuningScheduler` will generate a
 :ref:`default schedule<advanced/finetuning_scheduler:The Default Finetuning Schedule>` and proceed to finetune
-according to the generated schedule, using default :class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping`
+according to the generated schedule, using default
+:class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSEarlyStopping`
 and :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSCheckpoint` callbacks with
 ``monitor=val_loss``.

 .. code-block:: python

     from pytorch_lightning import Trainer
-    from pytorch_lightning.callbacks import FinetuningScheduler
+    from pytorch_lightning.callbacks.finetuning_scheduler import FinetuningScheduler

     trainer = Trainer(callbacks=[FinetuningScheduler()])
@@ -70,7 +72,7 @@ and executed in ascending order.
 .. code-block:: python

     from pytorch_lightning import Trainer
-    from pytorch_lightning.callbacks import FinetuningScheduler
+    from pytorch_lightning.callbacks.finetuning_scheduler import FinetuningScheduler

     trainer = Trainer(callbacks=[FinetuningScheduler(gen_ft_sched_only=True)])
@@ -133,19 +135,20 @@ and executed in ascending order.
 .. code-block:: python

     from pytorch_lightning import Trainer
-    from pytorch_lightning.callbacks import FinetuningScheduler
+    from pytorch_lightning.callbacks.finetuning_scheduler import FinetuningScheduler

     trainer = Trainer(callbacks=[FinetuningScheduler(ft_schedule="/path/to/my/schedule/my_schedule.yaml")])

 EarlyStopping and Epoch-Driven Phase Transition Criteria
 ========================================================

-By default, :class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping` and epoch-driven transition criteria are
-composed. If a ``max_transition_epoch`` is specified for a given phase, the next finetuning phase will begin at that
-epoch unless :class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping` criteria are met first.
+By default, :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSEarlyStopping` and epoch-driven
+transition criteria are composed. If a ``max_transition_epoch`` is specified for a given phase, the next finetuning
+phase will begin at that epoch unless
+:class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSEarlyStopping` criteria are met first.
 If :paramref:`~pytorch_lightning.callbacks.finetuning_scheduler.fts.FinetuningScheduler.epoch_transitions_only` is
-``True``, :class:`~pytorch_lightning.callbacks.early_stopping.EarlyStopping` will not be used and transitions will be
-exclusively epoch-driven.
+``True``, :class:`~pytorch_lightning.callbacks.finetuning_scheduler.fts_supporters.FTSEarlyStopping` will not be used
+and transitions will be exclusively epoch-driven.

 .. tip::
@@ -194,7 +197,7 @@ metadata.
 .. code-block:: python

     from pytorch_lightning import Trainer
-    from pytorch_lightning.callbacks import FinetuningScheduler
+    from pytorch_lightning.callbacks.finetuning_scheduler import FinetuningScheduler

     trainer = Trainer(callbacks=[FinetuningScheduler()], resume_from_checkpoint="some/path/to/my_checkpoint.ckpt")

docs/source/api_references.rst

Lines changed: 0 additions & 2 deletions
@@ -49,8 +49,6 @@ Callbacks API
     lr_monitor
     model_checkpoint
     progress
-    finetuning_scheduler.fts
-    finetuning_scheduler.fts_supporters

 Loggers API
 -----------

docs/source/extensions/callbacks.rst

Lines changed: 0 additions & 1 deletion
@@ -87,7 +87,6 @@ Lightning has a few built-in callbacks.
     :nosignatures:
     :template: classtemplate.rst

-    FinetuningScheduler
     BackboneFinetuning
     BaseFinetuning
     BasePredictionWriter

pl_examples/basic_examples/fts_superglue.py

Lines changed: 2 additions & 1 deletion
@@ -155,7 +155,8 @@ def validation_epoch_end(self, outputs):

     def init_pgs(self) -> List[Dict]:
         """Initialize the parameter groups, necessary only for the baseline 'nofts_baseline.yaml' configuration
-        that doesn't use the :class:`~pytorch_lightning.callbacks.FinetuningScheduler` callback.
+        that doesn't use the :class:`~pytorch_lightning.callbacks.finetuning_scheduler.FinetuningScheduler`
+        callback.

         Returns:
             List[Dict]: A list of parameter group dictionaries.
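An ``init_pgs``-style helper of the kind named in this hunk typically partitions parameters into optimizer groups, e.g. excluding bias and LayerNorm parameters from weight decay. The sketch below is a hypothetical illustration of that pattern using plain objects; the function body, the ``NO_DECAY`` patterns, and the parameter names are assumptions, not taken from ``fts_superglue.py``.

```python
# Hypothetical sketch of an ``init_pgs``-style helper: build optimizer
# parameter groups from (name, parameter) pairs, excluding bias/LayerNorm
# parameters from weight decay. Exclusion patterns and names are illustrative.
from typing import Dict, List, Tuple

NO_DECAY = ("bias", "LayerNorm.weight")


def init_pgs(named_params: List[Tuple[str, object]], weight_decay: float = 0.01) -> List[Dict]:
    """Return two parameter groups: one with weight decay applied, one without."""
    decay = [p for n, p in named_params if not any(nd in n for nd in NO_DECAY)]
    no_decay = [p for n, p in named_params if any(nd in n for nd in NO_DECAY)]
    return [
        {"params": decay, "weight_decay": weight_decay},
        {"params": no_decay, "weight_decay": 0.0},
    ]


# Stand-in (name, parameter) pairs; in practice these would come from
# ``model.named_parameters()``.
params = [
    ("encoder.layer.0.weight", "w0"),
    ("encoder.layer.0.bias", "b0"),
    ("encoder.LayerNorm.weight", "ln_w"),
]
print(len(init_pgs(params)))  # 2
```

The returned list of dictionaries is the shape torch optimizers accept for per-group hyperparameters, which matches the ``List[Dict]`` return annotation in the diff.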
