Commit 7e82dfa

update changelog and deprecation test

1 parent bca9f40

File tree

3 files changed: 13 additions, 1 deletion


CHANGELOG.md

Lines changed: 6 additions & 0 deletions
@@ -587,6 +587,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Deprecated passing only the callback state to `Callback.on_load_checkpoint(callback_state)` in favor of passing the callback state to `Callback.load_state_dict` and in 1.8, passing the entire checkpoint dictionary to `Callback.on_load_checkpoint(checkpoint)` ([#11887](https://github.com/PyTorchLightning/pytorch-lightning/pull/11887))
 
 
+- Deprecated `Trainer.tpu_cores` in favor of `Trainer.num_devices` ([#12437](https://github.com/PyTorchLightning/pytorch-lightning/pull/12437))
+
+
 ### Removed
 
 - Removed deprecated parameter `method` in `pytorch_lightning.utilities.model_helpers.is_overridden` ([#10507](https://github.com/PyTorchLightning/pytorch-lightning/pull/10507))
@@ -795,6 +798,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Removed `AcceleratorConnector.devices` property ([#12435](https://github.com/PyTorchLightning/pytorch-lightning/pull/12435))
 
 
+- Removed `AcceleratorConnector.tpu_cores` property ([#12437](https://github.com/PyTorchLightning/pytorch-lightning/pull/12437))
+
+
 ### Fixed
 
 - Fixed an issue where `ModelCheckpoint` could delete older checkpoints when `dirpath` has changed during resumed training ([#12045](https://github.com/PyTorchLightning/pytorch-lightning/pull/12045))
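
The first hunk above records the new `Trainer.tpu_cores` deprecation. A minimal migration sketch, assuming pytorch-lightning 1.6 and using a CPU-only Trainer purely to illustrate the attribute access (the constructor arguments and values here are illustrative, not taken from this commit):

from pytorch_lightning import Trainer

trainer = Trainer(accelerator="cpu", devices=1)

# Deprecated in v1.6 and scheduled for removal in v1.8; reading the property
# emits a DeprecationWarning (see the test updated in this commit):
# cores = trainer.tpu_cores

# Replacement pointed to by the changelog entry above; expected to be 1 here.
num_devices = trainer.num_devices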

pytorch_lightning/callbacks/xla_stats_monitor.py

Lines changed: 1 addition & 1 deletion
@@ -76,7 +76,7 @@ def on_train_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule")
         if isinstance(trainer.accelerator, TPUAccelerator):
             raise MisconfigurationException(
                 "You are using XLAStatsMonitor but are not running on TPU."
-                f" The Trainer accelerator type is set to {trainer.accelerator.name().upper()}."
+                f" The accelerator type is set to {trainer.accelerator.name().upper()}."
             )
 
         device = trainer.strategy.root_device

tests/accelerators/test_tpu.py

Lines changed: 6 additions & 0 deletions
@@ -103,6 +103,12 @@ def test_accelerator_tpu(accelerator, devices):
     assert isinstance(trainer.accelerator, TPUAccelerator)
     assert isinstance(trainer.strategy, TPUSpawnStrategy)
     assert trainer.num_devices == 8
+    with pytest.deprecated_call(
+        match="`Trainer.tpu_cores` is deprecated in v1.6 and will be removed in v1.8. "
+        "Please use `Trainer.devices` instead."
+    ):
+        trainer.tpu_cores == 8
+
 
 
 @RunIf(tpu=True)
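
The added block relies on `pytest.deprecated_call`, which fails the test unless the code inside the block emits a `DeprecationWarning` (or `PendingDeprecationWarning`) whose message matches the given pattern. A self-contained sketch of the same pattern, with `old_api` as a hypothetical stand-in for the deprecated `Trainer.tpu_cores` accessor:

import warnings

import pytest


def old_api() -> int:
    # Hypothetical deprecated accessor, used only for this illustration.
    warnings.warn("`old_api` is deprecated. Please use `new_api` instead.", DeprecationWarning)
    return 8


def test_old_api_emits_deprecation_warning() -> None:
    # The context manager fails the test if no matching DeprecationWarning is raised inside it.
    with pytest.deprecated_call(match="`old_api` is deprecated"):
        assert old_api() == 8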
