
Commit d76f491

Author: SeanNaren
Parents: 61d2014 + 41be61c

Merge branch 'master' into wip/acc

# Conflicts:
#	pytorch_lightning/accelerators/accelerator.py
#	pytorch_lightning/plugins/training_type/training_type_plugin.py

3 files changed: +3 −14 lines


CHANGELOG.md

Lines changed: 3 additions & 0 deletions
```diff
@@ -62,6 +62,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added reset dataloader hooks to Training Plugins and Accelerators ([#7861](https://github.com/PyTorchLightning/pytorch-lightning/pull/7861))
 
 
+- Added trainer stage hooks for Training Plugins and Accelerators ([#7864](https://github.com/PyTorchLightning/pytorch-lightning/pull/7864))
+
+
 ### Changed
 
 - Changed calling of `untoggle_optimizer(opt_idx)` out of the closure function ([#7563](https://github.com/PyTorchLightning/pytorch-lightning/pull/7563))
```

pytorch_lightning/accelerators/accelerator.py

Lines changed: 0 additions & 7 deletions
```diff
@@ -591,12 +591,5 @@ def on_train_end(self) -> None:
     def on_train_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int) -> None:
         """
         Called in the training loop before anything happens for that batch.
-
-        If you return -1 here, you will skip training for the rest of the current epoch.
-
-        Args:
-            batch: The batched data as it is returned by the training DataLoader.
-            batch_idx: the index of the batch
-            dataloader_idx: the index of the dataloader
         """
         return self.training_type_plugin.on_train_batch_start(batch, batch_idx, dataloader_idx)
```
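The surviving body line shows the delegation pattern these hooks follow: the `Accelerator` does no work of its own and forwards the call to its `training_type_plugin`. A minimal, self-contained sketch of that pattern (the class bodies below are simplified stand-ins, not the real Lightning classes):

```python
from typing import Any


class TrainingTypePlugin:
    """Simplified stand-in for Lightning's training-type plugin base."""

    def on_train_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int) -> None:
        """Called in the training loop before anything happens for that batch."""


class Accelerator:
    """Simplified stand-in showing only the hook-forwarding behavior."""

    def __init__(self, training_type_plugin: TrainingTypePlugin) -> None:
        self.training_type_plugin = training_type_plugin

    def on_train_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int) -> None:
        # The accelerator adds no behavior of its own; the plugin decides what to do.
        return self.training_type_plugin.on_train_batch_start(batch, batch_idx, dataloader_idx)
```

The trimmed docstring text mirrors `LightningModule.on_train_batch_start`, which is presumably why the duplicated detail was dropped from these internal forwarding points.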

pytorch_lightning/plugins/training_type/training_type_plugin.py

Lines changed: 0 additions & 7 deletions
```diff
@@ -366,12 +366,5 @@ def on_predict_end(self):
     def on_train_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int) -> None:
         """
         Called in the training loop before anything happens for that batch.
-
-        If you return -1 here, you will skip training for the rest of the current epoch.
-
-        Args:
-            batch: The batched data as it is returned by the training DataLoader.
-            batch_idx: the index of the batch
-            dataloader_idx: the index of the dataloader
         """
         pass
```
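Since the plugin-side hook is a no-op (`pass`), subclasses decide what happens at this point. A hedged sketch of an override, assuming the import path matches the file shown above; `VerboseTrainingTypePlugin` is hypothetical, and the real base class has other abstract members a working subclass must also implement:

```python
from typing import Any

from pytorch_lightning.plugins.training_type.training_type_plugin import TrainingTypePlugin


class VerboseTrainingTypePlugin(TrainingTypePlugin):
    """Hypothetical plugin that logs each batch before the training loop touches it."""

    def on_train_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int) -> None:
        # Runs before any forward/backward work on this batch.
        print(f"starting batch {batch_idx} from dataloader {dataloader_idx}")
```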
