Remove deprecated callback hooks #14834

Merged: 46 commits, Oct 10, 2022

Commits
4e92192
Remove deprecated callback hooks
awaelchli Sep 21, 2022
5467354
changelog
awaelchli Sep 21, 2022
a7d36dd
revert
awaelchli Sep 21, 2022
599deb5
removal
awaelchli Sep 21, 2022
9723db9
Remove deprecated on_load/save_checkpoint behavior
awaelchli Sep 21, 2022
e600d3f
Finish removal
carmocca Sep 21, 2022
cb2b76f
Merge branch 'master' into feature/remove/callback-hooks
carmocca Sep 22, 2022
8e7b965
Fix missed tests
carmocca Sep 22, 2022
363ba86
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Sep 26, 2022
3d8b778
merge master
Sep 27, 2022
419ef63
Merge branch 'master' into feature/remove/callback-hooks
Sep 27, 2022
f808144
fixed botched merge
Sep 27, 2022
94c5e03
fixed botched merge v2
Sep 27, 2022
1655841
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Sep 29, 2022
938578a
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Sep 29, 2022
1a7b9c4
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Sep 29, 2022
e209ec6
Merge branch 'master' into feature/remove/callback-hooks
otaj Sep 30, 2022
f5a749e
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 30, 2022
328a549
Merge branch 'master' into feature/remove/callback-hooks
otaj Sep 30, 2022
5d22054
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Sep 30, 2022
2c954c7
fix tests
awaelchli Sep 30, 2022
58dbc98
Merge branch 'master' into feature/remove/callback-hooks
Borda Oct 5, 2022
0b30b3e
Merge branch 'master' into feature/remove/callback-checkpoint-hooks
Borda Oct 5, 2022
758da2b
convert warnings to error
awaelchli Oct 8, 2022
34b9d5a
move tests to 2.0 file
awaelchli Oct 8, 2022
7c2c526
fix tests
awaelchli Oct 8, 2022
057441c
Merge branch 'master' into feature/remove/callback-checkpoint-hooks
awaelchli Oct 10, 2022
e64e14c
raise error
awaelchli Oct 10, 2022
582a233
error on legacy argument
awaelchli Oct 10, 2022
7e37d95
error instead of deprecation
awaelchli Oct 10, 2022
5440902
changelog
awaelchli Oct 10, 2022
5952296
fix mypy
awaelchli Oct 10, 2022
4681d79
flake
awaelchli Oct 10, 2022
ee5b55a
remove old tests
awaelchli Oct 10, 2022
861cce1
update tests
awaelchli Oct 10, 2022
e812b5a
address review
awaelchli Oct 10, 2022
5d5cfa4
merge
awaelchli Oct 10, 2022
301cc7f
version number
awaelchli Oct 10, 2022
921b0ad
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Oct 10, 2022
2d16b86
fix bad merge
awaelchli Oct 10, 2022
0db92d8
fix tests checking for old deprecation warnings
awaelchli Oct 10, 2022
8efb6e9
one more
awaelchli Oct 10, 2022
7635128
remove unused imports
awaelchli Oct 10, 2022
63de8bf
Merge branch 'master' into feature/remove/callback-hooks
awaelchli Oct 10, 2022
498e981
undo app change
awaelchli Oct 10, 2022
ed2db31
reset app changes
awaelchli Oct 10, 2022
9 changes: 0 additions & 9 deletions docs/source-pytorch/extensions/callbacks.rst
@@ -152,12 +152,6 @@ state_key
Hooks
=====

on_configure_sharded_model
^^^^^^^^^^^^^^^^^^^^^^^^^^

.. automethod:: pytorch_lightning.callbacks.Callback.on_configure_sharded_model
:noindex:

setup
^^^^^

@@ -266,9 +260,6 @@ on_predict_epoch_end
.. automethod:: pytorch_lightning.callbacks.Callback.on_predict_epoch_end
:noindex:

.. automethod:: pytorch_lightning.callbacks.Callback.on_epoch_end
:noindex:

on_validation_batch_start
^^^^^^^^^^^^^^^^^^^^^^^^^

2 changes: 1 addition & 1 deletion examples/app_components/python/component_tracer.py
@@ -27,7 +27,7 @@ def on_train_start(self, trainer, pl_module) -> None:
print("Even the Lightning Work is available and state transfer works !")
print(self.lightning_work)

-    def on_batch_end(self, trainer, *_) -> None:
+    def on_train_batch_end(self, trainer, *_) -> None:
# On every batch end, collects some information.
# This is communicated automatically to the rest of the app,
# so you can track your training in real time in the Lightning App UI.
8 changes: 0 additions & 8 deletions src/lightning_app/utilities/introspection.py
@@ -88,8 +88,6 @@ class LightningModuleVisitor(LightningVisitor):
"on_fit_end",
"on_load_checkpoint",
"on_save_checkpoint",
"on_pretrain_routine_start",
"on_pretrain_routine_end",
"on_test_batch_start",
"on_test_batch_end",
"on_test_epoch_start",
@@ -184,18 +182,12 @@ class LightningCallbackVisitor(LightningVisitor):
"on_validation_epoch_end",
"on_test_epoch_start",
"on_test_epoch_end",
"on_epoch_start",
"on_epoch_end",
"on_batch_start",
"on_validation_batch_start",
"on_validation_batch_end",
"on_test_batch_start",
"on_test_batch_end",
"on_batch_end",
"on_train_start",
"on_train_end",
"on_pretrain_routine_start",
"on_pretrain_routine_end",
"on_validation_start",
"on_validation_end",
"on_test_start",
10 changes: 10 additions & 0 deletions src/pytorch_lightning/CHANGELOG.md
@@ -210,6 +210,16 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Removed the deprecated way to set the distributed backend via the environment variable `PL_TORCH_DISTRIBUTED_BACKEND`, in favor of setting the `process_group_backend` in the strategy constructor ([#14693](https://github.com/Lightning-AI/lightning/pull/14693))


- Removed deprecated callback hooks ([#14834](https://github.com/Lightning-AI/lightning/pull/14834))
* `Callback.on_configure_sharded_model` in favor of `Callback.setup`
* `Callback.on_before_accelerator_backend_setup` in favor of `Callback.setup`
* `Callback.on_batch_start` in favor of `Callback.on_train_batch_start`
* `Callback.on_batch_end` in favor of `Callback.on_train_batch_end`
* `Callback.on_epoch_start` in favor of `Callback.on_{train,validation,test}_epoch_start`
* `Callback.on_epoch_end` in favor of `Callback.on_{train,validation,test}_epoch_end`
* `Callback.on_pretrain_routine_{start,end}` in favor of `Callback.on_fit_start`


- Removed the deprecated device attributes `Trainer.{devices,gpus,num_gpus,ipus,tpu_cores}` in favor of the accelerator-agnostic `Trainer.num_devices` ([#14829](https://github.com/Lightning-AI/lightning/pull/14829))


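The changelog entries above follow a consistent old-to-new mapping, and the later commits ("convert warnings to error", "error instead of deprecation") make overriding a removed hook fail loudly. A minimal plain-Python sketch of that idea, with hypothetical helper names (`REMOVED_HOOKS`, `check_callback` are illustrative, not the real Lightning API):

```python
# Hypothetical sketch: map each removed callback hook to its replacement,
# and raise when user code still overrides a removed hook.
REMOVED_HOOKS = {
    "on_configure_sharded_model": "setup",
    "on_before_accelerator_backend_setup": "setup",
    "on_batch_start": "on_train_batch_start",
    "on_batch_end": "on_train_batch_end",
    "on_epoch_start": "on_{train,validation,test}_epoch_start",
    "on_epoch_end": "on_{train,validation,test}_epoch_end",
    "on_pretrain_routine_start": "on_fit_start",
    "on_pretrain_routine_end": "on_fit_start",
}


def check_callback(callback) -> None:
    """Raise if ``callback`` defines any hook removed in this PR."""
    for old, new in REMOVED_HOOKS.items():
        if callable(getattr(callback, old, None)):
            raise RuntimeError(
                f"`{type(callback).__name__}.{old}` was removed in v1.8. "
                f"Use `{new}` instead."
            )


class LegacyCallback:
    def on_batch_end(self, trainer, pl_module):  # removed hook
        pass


try:
    check_callback(LegacyCallback())
except RuntimeError as err:
    print(err)
```

A callback that only defines still-supported hooks passes `check_callback` unchanged.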
70 changes: 0 additions & 70 deletions src/pytorch_lightning/callbacks/callback.py
@@ -56,22 +56,6 @@ def _generate_state_key(self, **kwargs: Any) -> str:
"""
return f"{self.__class__.__qualname__}{repr(kwargs)}"

def on_configure_sharded_model(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use `setup()` instead.

Called before configure sharded model.
"""

def on_before_accelerator_backend_setup(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use ``setup()`` instead.

Called before accelerator is being setup.
"""

def setup(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule", stage: str) -> None:
"""Called when fit, validate, test, predict, or tune begins."""

@@ -146,42 +130,6 @@ def on_predict_epoch_start(self, trainer: "pl.Trainer", pl_module: "pl.Lightning
def on_predict_epoch_end(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule", outputs: List[Any]) -> None:
"""Called when the predict epoch ends."""

def on_epoch_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use
``on_<train/validation/test>_epoch_start`` instead.

Called when either of train/val/test epoch begins.
"""

def on_epoch_end(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use
``on_<train/validation/test>_epoch_end`` instead.

Called when either of train/val/test epoch ends.
"""

def on_batch_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use
``on_train_batch_start`` instead.

Called when the training batch begins.
"""

def on_batch_end(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6
This callback hook was deprecated in v1.6 and will be removed in v1.8. Use
``on_train_batch_end`` instead.

Called when the training batch ends.
"""

def on_validation_batch_start(
self, trainer: "pl.Trainer", pl_module: "pl.LightningModule", batch: Any, batch_idx: int, dataloader_idx: int
) -> None:
@@ -236,24 +184,6 @@ def on_train_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule")
def on_train_end(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
"""Called when the train ends."""

def on_pretrain_routine_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6

This callback hook was deprecated in v1.6 and will be removed in v1.8. Use ``on_fit_start`` instead.

Called when the pretrain routine begins.
"""

def on_pretrain_routine_end(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
r"""
.. deprecated:: v1.6

This callback hook was deprecated in v1.6 and will be removed in v1.8. Use ``on_fit_start`` instead.

Called when the pretrain routine ends.
"""

def on_validation_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule") -> None:
"""Called when the validation loop begins."""

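The base `Callback` above can simply drop these methods because the trainer dispatches hooks by name and skips callbacks that do not define them. A simplified sketch of that dispatch, in the spirit of `Trainer._call_callback_hooks` (the helper and callback here are illustrative, not Lightning's actual implementation):

```python
# Minimal sketch of name-based hook dispatch: invoke ``hook_name`` on every
# callback that defines it, forwarding any arguments; unknown names are no-ops.
class PrintingCallback:
    def __init__(self):
        self.calls = []

    def on_train_batch_end(self, batch, batch_idx):
        self.calls.append(("on_train_batch_end", batch_idx))


def call_callback_hooks(callbacks, hook_name, *args, **kwargs):
    for cb in callbacks:
        fn = getattr(cb, hook_name, None)
        if callable(fn):
            fn(*args, **kwargs)


cb = PrintingCallback()
call_callback_hooks([cb], "on_train_batch_end", batch=None, batch_idx=0)
call_callback_hooks([cb], "on_epoch_end")  # no-op: the hook no longer exists
print(cb.calls)  # [('on_train_batch_end', 0)]
```

Because dispatch is a no-op for undefined names, removing the deprecated stubs from the base class is enough to retire them without touching the dispatch machinery.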
8 changes: 0 additions & 8 deletions src/pytorch_lightning/callbacks/lambda_function.py
@@ -40,9 +40,7 @@ class LambdaCallback(Callback):

def __init__(
self,
on_before_accelerator_backend_setup: Optional[Callable] = None,
setup: Optional[Callable] = None,
on_configure_sharded_model: Optional[Callable] = None,
teardown: Optional[Callable] = None,
on_init_start: Optional[Callable] = None,
on_init_end: Optional[Callable] = None,
@@ -58,18 +56,12 @@ def __init__(
on_validation_epoch_end: Optional[Callable] = None,
on_test_epoch_start: Optional[Callable] = None,
on_test_epoch_end: Optional[Callable] = None,
on_epoch_start: Optional[Callable] = None,
on_epoch_end: Optional[Callable] = None,
on_batch_start: Optional[Callable] = None,
on_validation_batch_start: Optional[Callable] = None,
on_validation_batch_end: Optional[Callable] = None,
on_test_batch_start: Optional[Callable] = None,
on_test_batch_end: Optional[Callable] = None,
on_batch_end: Optional[Callable] = None,
on_train_start: Optional[Callable] = None,
on_train_end: Optional[Callable] = None,
on_pretrain_routine_start: Optional[Callable] = None,
on_pretrain_routine_end: Optional[Callable] = None,
on_validation_start: Optional[Callable] = None,
on_validation_end: Optional[Callable] = None,
on_test_start: Optional[Callable] = None,
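`LambdaCallback` accepts plain callables and binds them as hook attributes, so trimming its signature is all the removal requires. A self-contained sketch of that pattern with a reduced, hypothetical parameter list (`MiniLambdaCallback` is illustrative, not the real class):

```python
# Sketch of a LambdaCallback-style adapter: accept optional callables for
# surviving hooks and attach only the ones that were provided.
from typing import Callable, Optional


class MiniLambdaCallback:
    def __init__(
        self,
        setup: Optional[Callable] = None,
        on_train_batch_start: Optional[Callable] = None,
        on_train_batch_end: Optional[Callable] = None,
        on_fit_start: Optional[Callable] = None,
    ):
        # Snapshot the constructor arguments, then bind each provided callable.
        hooks = {k: v for k, v in locals().items() if k != "self"}
        for name, fn in hooks.items():
            if callable(fn):
                setattr(self, name, fn)


events = []
cb = MiniLambdaCallback(on_train_batch_end=lambda *a: events.append("batch_end"))
cb.on_train_batch_end(None, 0)
assert not hasattr(cb, "on_batch_end")  # removed hooks have no fallback
print(events)  # ['batch_end']
```

Dropping a parameter such as `on_epoch_start` therefore makes passing it a `TypeError` at construction time, which surfaces stale user code immediately.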
42 changes: 0 additions & 42 deletions src/pytorch_lightning/core/hooks.py
@@ -63,32 +63,6 @@ def on_predict_start(self) -> None:
def on_predict_end(self) -> None:
"""Called at the end of predicting."""

def on_pretrain_routine_start(self) -> None:
"""Called at the beginning of the pretrain routine (between fit and train start).

- fit
- pretrain_routine start
- pretrain_routine end
- training_start

.. deprecated:: v1.6
:meth:`on_pretrain_routine_start` has been deprecated in v1.6 and will be removed in v1.8.
Use ``on_fit_start`` instead.
"""

def on_pretrain_routine_end(self) -> None:
"""Called at the end of the pretrain routine (between fit and train start).

- fit
- pretrain_routine start
- pretrain_routine end
- training_start

.. deprecated:: v1.6
:meth:`on_pretrain_routine_end` has been deprecated in v1.6 and will be removed in v1.8.
Use ``on_fit_start`` instead.
"""

def on_train_batch_start(self, batch: Any, batch_idx: int) -> Optional[int]:
"""Called in the training loop before anything happens for that batch.

@@ -189,22 +163,6 @@ def on_predict_model_eval(self) -> None:
"""Sets the model to eval during the predict loop."""
self.trainer.model.eval()

def on_epoch_start(self) -> None:
"""Called when either of train/val/test epoch begins.

.. deprecated:: v1.6
:meth:`on_epoch_start` has been deprecated in v1.6 and will be removed in v1.8.
Use ``on_<train/validation/test>_epoch_start`` instead.
"""

def on_epoch_end(self) -> None:
"""Called when either of train/val/test epoch ends.

.. deprecated:: v1.6
:meth:`on_epoch_end` has been deprecated in v1.6 and will be removed in v1.8.
Use ``on_<train/validation/test>_epoch_end`` instead.
"""

def on_train_epoch_start(self) -> None:
"""Called in the training loop at the very beginning of the epoch."""

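With `on_epoch_start`/`on_epoch_end` gone from the module hooks above, code that wants one behavior across train, validation, and test epochs delegates the three split hooks to a shared helper. A plain-Python sketch under that assumption (the `EpochTimer` class and `_any_epoch_start` helper are hypothetical):

```python
# Sketch: recover "run once per epoch, in every loop" behavior after the
# removal of on_epoch_start, by routing each split hook through one helper.
class EpochTimer:
    def __init__(self):
        self.started = []

    def _any_epoch_start(self, phase):
        # Shared logic formerly placed in on_epoch_start.
        self.started.append(phase)

    def on_train_epoch_start(self):
        self._any_epoch_start("train")

    def on_validation_epoch_start(self):
        self._any_epoch_start("validation")

    def on_test_epoch_start(self):
        self._any_epoch_start("test")


timer = EpochTimer()
timer.on_train_epoch_start()
timer.on_validation_epoch_start()
print(timer.started)  # ['train', 'validation']
```

The same delegation applies symmetrically to the `*_epoch_end` hooks.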
6 changes: 1 addition & 5 deletions src/pytorch_lightning/loops/dataloader/evaluation_loop.py
@@ -267,10 +267,8 @@ def _on_evaluation_end(self, *args: Any, **kwargs: Any) -> None:
self.trainer._logger_connector.reset_results()

def _on_evaluation_epoch_start(self, *args: Any, **kwargs: Any) -> None:
-        """Runs ``on_epoch_start`` and ``on_{validation/test}_epoch_start`` hooks."""
+        """Runs the ``on_{validation/test}_epoch_start`` hooks."""
self.trainer._logger_connector.on_epoch_start()
self.trainer._call_callback_hooks("on_epoch_start", *args, **kwargs)
self.trainer._call_lightning_module_hook("on_epoch_start", *args, **kwargs)

hook_name = "on_test_epoch_start" if self.trainer.testing else "on_validation_epoch_start"
self.trainer._call_callback_hooks(hook_name, *args, **kwargs)
@@ -295,8 +293,6 @@ def _on_evaluation_epoch_end(self) -> None:
self.trainer._call_callback_hooks(hook_name)
self.trainer._call_lightning_module_hook(hook_name)

self.trainer._call_callback_hooks("on_epoch_end")
self.trainer._call_lightning_module_hook("on_epoch_end")
self.trainer._logger_connector.on_epoch_end()

@staticmethod
4 changes: 0 additions & 4 deletions src/pytorch_lightning/loops/epoch/training_epoch_loop.py
@@ -200,9 +200,6 @@ def advance(self, data_fetcher: AbstractDataFetcher) -> None:  # type: ignore[ov
self._warning_cache.warn("train_dataloader yielded None. If this was on purpose, ignore this warning...")
batch_output = []
else:
# hook
self.trainer._call_callback_hooks("on_batch_start")

# hook
self.trainer._call_callback_hooks("on_train_batch_start", batch, batch_idx)
response = self.trainer._call_lightning_module_hook("on_train_batch_start", batch, batch_idx)
@@ -232,7 +229,6 @@ def advance(self, data_fetcher: AbstractDataFetcher) -> None:  # type: ignore[ov

self.trainer._call_callback_hooks("on_train_batch_end", batch_end_outputs, batch, batch_idx)
self.trainer._call_lightning_module_hook("on_train_batch_end", batch_end_outputs, batch, batch_idx)
self.trainer._call_callback_hooks("on_batch_end")
self.trainer._logger_connector.on_batch_end()

self.batch_progress.increment_completed()
9 changes: 1 addition & 8 deletions src/pytorch_lightning/loops/fit_loop.py
@@ -219,8 +219,7 @@ def on_run_start(self) -> None:  # type: ignore[override]
self.trainer._call_strategy_hook("on_train_start")

def on_advance_start(self) -> None: # type: ignore[override]
-        """Prepares the dataloader for training and calls the hooks ``on_epoch_start`` and
-        ``on_train_epoch_start``"""
+        """Prepares the dataloader for training and calls the hook ``on_train_epoch_start``"""
model = self.trainer.lightning_module

# reset train dataloader
@@ -245,9 +244,6 @@ def on_advance_start(self) -> None:  # type: ignore[override]

self.trainer._logger_connector.on_epoch_start()

self.trainer._call_callback_hooks("on_epoch_start")
self.trainer._call_lightning_module_hook("on_epoch_start")

self.trainer._call_callback_hooks("on_train_epoch_start")
self.trainer._call_lightning_module_hook("on_train_epoch_start")

@@ -298,9 +294,6 @@ def on_advance_end(self) -> None:
self.trainer._call_callback_hooks("on_train_epoch_end")
self.trainer._call_lightning_module_hook("on_train_epoch_end")

self.trainer._call_callback_hooks("on_epoch_end")
self.trainer._call_lightning_module_hook("on_epoch_end")

self.trainer._logger_connector.on_epoch_end()

if self.epoch_loop._num_ready_batches_reached():
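After the deletions in `fit_loop.py` and `training_epoch_loop.py`, the per-epoch hook order reduces to the train-specific hooks only. A pure-Python sketch of the trimmed sequence (the callback and driver below are illustrative stand-ins, not the real loop classes):

```python
# Sketch of the trimmed per-epoch call order after this PR: no
# on_epoch_start/on_batch_start/on_batch_end/on_epoch_end, only the
# train-specific hooks remain.
class RecordingCallback:
    def __init__(self):
        self.log = []

    def on_train_epoch_start(self):
        self.log.append("on_train_epoch_start")

    def on_train_batch_start(self, batch_idx):
        self.log.append(f"on_train_batch_start[{batch_idx}]")

    def on_train_batch_end(self, batch_idx):
        self.log.append(f"on_train_batch_end[{batch_idx}]")

    def on_train_epoch_end(self):
        self.log.append("on_train_epoch_end")


def run_epoch(cb, num_batches=2):
    cb.on_train_epoch_start()
    for i in range(num_batches):
        cb.on_train_batch_start(i)
        # ... training_step / optimizer step would run here ...
        cb.on_train_batch_end(i)
    cb.on_train_epoch_end()


cb = RecordingCallback()
run_epoch(cb)
print(cb.log)
```

Each batch is now bracketed by exactly one start/end hook pair instead of two, which is what the deleted `_call_callback_hooks("on_batch_start")`/`("on_batch_end")` lines previously provided.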