Commit a64438c

Centralize rank_zero_only utilities into their own module (#11747)
* Centralize rank_zero_only utilities into their own module (fixes #11746)
* PossibleUserWarning
* Update test_warnings.py
* update imports
* more imports
* Update CHANGELOG.md
* Update mlflow.py
* Update cli.py
* Update api_references.rst
* Update meta.py
* add deprecation tests
* debug standalone
* fix standalone tests
* Update CHANGELOG.md
1 parent 34c454c commit a64438c
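
At a glance, the commit gives the rank-zero helpers a single canonical home and leaves the old import paths behind as deprecated aliases. A minimal sketch of the import surface before and after, inferred from the CHANGELOG entries in this diff:

    # New canonical module introduced by this commit
    from pytorch_lightning.utilities.rank_zero import (
        LightningDeprecationWarning,
        rank_zero_debug,
        rank_zero_deprecation,
        rank_zero_info,
        rank_zero_only,
        rank_zero_warn,
    )

    # Old locations still import for now, but are deprecated per the CHANGELOG
    from pytorch_lightning.utilities.distributed import rank_zero_only  # deprecated
    from pytorch_lightning.utilities.warnings import rank_zero_warn  # deprecated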

71 files changed: +370 additions, -239 deletions

CHANGELOG.md
Lines changed: 21 additions & 0 deletions

@@ -92,6 +92,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added support for `Bagua` training strategy ([#11146](https://github.com/PyTorchLightning/pytorch-lightning/pull/11146))


+- Added `rank_zero` module to centralize utilities ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
 ### Changed

 - Implemented a new native and rich format in `_print_results` method of the `EvaluationLoop` ([#11332](https://github.com/PyTorchLightning/pytorch-lightning/pull/11332))

@@ -323,6 +326,24 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Deprecated `on_configure_sharded_model` callback hook in favor of `setup` ([#11627](https://github.com/PyTorchLightning/pytorch-lightning/pull/11627))


+- Deprecated `pytorch_lightning.utilities.distributed.rank_zero_only` in favor of `pytorch_lightning.utilities.rank_zero.rank_zero_only` ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
+- Deprecated `pytorch_lightning.utilities.distributed.rank_zero_debug` in favor of `pytorch_lightning.utilities.rank_zero.rank_zero_debug` ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
+- Deprecated `pytorch_lightning.utilities.distributed.rank_zero_info` in favor of `pytorch_lightning.utilities.rank_zero.rank_zero_info` ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
+- Deprecated `pytorch_lightning.utilities.warnings.rank_zero_warn` in favor of `pytorch_lightning.utilities.rank_zero.rank_zero_warn` ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
+- Deprecated `pytorch_lightning.utilities.warnings.rank_zero_deprecation` in favor of `pytorch_lightning.utilities.rank_zero.rank_zero_deprecation` ([#11747](https://github.com/PyTorchLightning/pytorch-lightning/pull/11747))
+
+
+- Deprecated `pytorch_lightning.utilities.warnings.LightningDeprecationWarning` in favor of `pytorch_lightning.utilities.rank_zero.LightningDeprecationWarning`
+
+
 ### Removed

 - Removed deprecated parameter `method` in `pytorch_lightning.utilities.model_helpers.is_overridden` ([#10507](https://github.com/PyTorchLightning/pytorch-lightning/pull/10507))
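
All of these helpers share one contract: the wrapped action runs only on the process with global rank 0 and is a no-op everywhere else. A short usage sketch of the new module (the function name and message strings are illustrative, not part of the diff):

    from pytorch_lightning.utilities.rank_zero import rank_zero_info, rank_zero_only, rank_zero_warn

    rank_zero_info("logged once per job, on global rank 0 only")
    rank_zero_warn("same gating, routed through Python's warnings machinery")

    @rank_zero_only
    def write_run_summary(path: str) -> None:
        # The body executes on rank 0 only; other ranks skip it and get None.
        with open(path, "w") as f:
            f.write("finished\n")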

docs/source/api_references.rst
Lines changed: 1 addition & 0 deletions

@@ -289,5 +289,6 @@ Utilities API
     memory
     model_summary
     parsing
+    rank_zero
     seed
     warnings

docs/source/common/checkpointing.rst
Lines changed: 1 addition & 1 deletion

@@ -98,7 +98,7 @@ Lightning automatically ensures that the model is saved only on the main process
     trainer.save_checkpoint("example.ckpt")

 Not using :meth:`~pytorch_lightning.trainer.trainer.Trainer.save_checkpoint` can lead to unexpected behavior and potential deadlock. Using other saving functions will result in all devices attempting to save the checkpoint. As a result, we highly recommend using the Trainer's save functionality.
-If using custom saving functions cannot be avoided, we recommend using the :func:`~pytorch_lightning.utilities.distributed.rank_zero_only` decorator to ensure saving occurs only on the main process. Note that this will only work if all ranks hold the exact same state and won't work when using
+If using custom saving functions cannot be avoided, we recommend using the :func:`~pytorch_lightning.utilities.rank_zero.rank_zero_only` decorator to ensure saving occurs only on the main process. Note that this will only work if all ranks hold the exact same state and won't work when using
 model parallel distributed strategies such as deepspeed or sharded training.
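
To make the updated recommendation concrete, a minimal sketch of a rank-zero-guarded custom save (the function name and the torch.save call are illustrative, not part of this diff):

    import torch

    from pytorch_lightning.utilities.rank_zero import rank_zero_only


    @rank_zero_only
    def save_weights_manually(model: torch.nn.Module, path: str) -> None:
        # Only the main process writes, so devices don't race on the same file.
        # Assumes every rank holds identical weights, which is exactly why the
        # docs warn this is unsafe with model-parallel strategies.
        torch.save(model.state_dict(), path)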

docs/source/extensions/logging.rst
Lines changed: 1 addition & 1 deletion

@@ -205,7 +205,7 @@ Make a Custom Logger
 ********************

 You can implement your own logger by writing a class that inherits from :class:`~pytorch_lightning.loggers.base.LightningLoggerBase`.
-Use the :func:`~pytorch_lightning.loggers.base.rank_zero_experiment` and :func:`~pytorch_lightning.utilities.distributed.rank_zero_only` decorators to make sure that only the first process in DDP training creates the experiment and logs the data respectively.
+Use the :func:`~pytorch_lightning.loggers.base.rank_zero_experiment` and :func:`~pytorch_lightning.utilities.rank_zero.rank_zero_only` decorators to make sure that only the first process in DDP training creates the experiment and logs the data respectively.

 .. testcode::
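
For context, the testcode block referenced here defines a custom logger; a condensed sketch with the updated import (class name and method bodies are illustrative):

    from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
    from pytorch_lightning.utilities.rank_zero import rank_zero_only


    class MyLogger(LightningLoggerBase):
        @property
        def name(self):
            return "MyLogger"

        @property
        def version(self):
            return "0.1"

        @property
        @rank_zero_experiment
        def experiment(self):
            # Created on global rank 0 only; other ranks get a dummy handle.
            return object()

        @rank_zero_only
        def log_hyperparams(self, params):
            # Only the first DDP process records hyperparameters.
            pass

        @rank_zero_only
        def log_metrics(self, metrics, step=None):
            # Only the first DDP process records metrics.
            print(metrics, step)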

pl_examples/basic_examples/autoencoder.py
Lines changed: 1 addition & 1 deletion

@@ -25,9 +25,9 @@
 import pytorch_lightning as pl
 from pl_examples import _DATASETS_PATH, cli_lightning_logo
 from pl_examples.basic_examples.mnist_datamodule import MNIST
-from pytorch_lightning.utilities import rank_zero_only
 from pytorch_lightning.utilities.cli import LightningCLI
 from pytorch_lightning.utilities.imports import _TORCHVISION_AVAILABLE
+from pytorch_lightning.utilities.rank_zero import rank_zero_only

 if _TORCHVISION_AVAILABLE:
     import torchvision

pl_examples/domain_templates/computer_vision_fine_tuning.py
Lines changed: 1 addition & 1 deletion

@@ -58,8 +58,8 @@
 from pl_examples import cli_lightning_logo
 from pytorch_lightning import LightningDataModule
 from pytorch_lightning.callbacks.finetuning import BaseFinetuning
-from pytorch_lightning.utilities import rank_zero_info
 from pytorch_lightning.utilities.cli import LightningCLI
+from pytorch_lightning.utilities.rank_zero import rank_zero_info

 log = logging.getLogger(__name__)
 DATA_URL = "https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip"

pytorch_lightning/callbacks/early_stopping.py
Lines changed: 1 addition & 1 deletion

@@ -26,8 +26,8 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import rank_zero_warn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_warn

 log = logging.getLogger(__name__)

pytorch_lightning/callbacks/finetuning.py
Lines changed: 1 addition & 1 deletion

@@ -26,8 +26,8 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import rank_zero_warn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_warn

 log = logging.getLogger(__name__)

pytorch_lightning/callbacks/gpu_stats_monitor.py
Lines changed: 2 additions & 1 deletion

@@ -29,9 +29,10 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import _AcceleratorType, rank_zero_deprecation, rank_zero_only
+from pytorch_lightning.utilities import _AcceleratorType
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.parsing import AttributeDict
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_only
 from pytorch_lightning.utilities.types import STEP_OUTPUT

pytorch_lightning/callbacks/lr_monitor.py
Lines changed: 1 addition & 1 deletion

@@ -27,8 +27,8 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import rank_zero_deprecation, rank_zero_warn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_warn
 from pytorch_lightning.utilities.types import LRSchedulerConfig

pytorch_lightning/callbacks/model_checkpoint.py
Lines changed: 1 addition & 1 deletion

@@ -33,9 +33,9 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import rank_zero_info, rank_zero_warn
 from pytorch_lightning.utilities.cloud_io import get_filesystem
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_info, rank_zero_warn
 from pytorch_lightning.utilities.types import _METRIC, _PATH, STEP_OUTPUT
 from pytorch_lightning.utilities.warnings import WarningCache

pytorch_lightning/callbacks/progress/base.py
Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks import Callback
-from pytorch_lightning.utilities import rank_zero_warn
+from pytorch_lightning.utilities.rank_zero import rank_zero_warn


 class ProgressBarBase(Callback):

pytorch_lightning/callbacks/progress/tqdm_progress.py
Lines changed: 1 addition & 1 deletion

@@ -27,7 +27,7 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.progress.base import ProgressBarBase
-from pytorch_lightning.utilities.distributed import rank_zero_debug
+from pytorch_lightning.utilities.rank_zero import rank_zero_debug

 _PAD_SIZE = 5

pytorch_lightning/callbacks/pruning.py
Lines changed: 1 addition & 1 deletion

@@ -30,8 +30,8 @@
 from pytorch_lightning.callbacks.base import Callback
 from pytorch_lightning.core.lightning import LightningModule
 from pytorch_lightning.utilities.apply_func import apply_to_collection
-from pytorch_lightning.utilities.distributed import rank_zero_debug, rank_zero_only
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_debug, rank_zero_only

 log = logging.getLogger(__name__)

pytorch_lightning/callbacks/stochastic_weight_avg.py
Lines changed: 1 addition & 1 deletion

@@ -24,8 +24,8 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import rank_zero_info, rank_zero_warn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_info, rank_zero_warn
 from pytorch_lightning.utilities.types import LRSchedulerConfig

 _AVG_FN = Callable[[torch.Tensor, torch.Tensor, torch.LongTensor], torch.FloatTensor]

pytorch_lightning/callbacks/timer.py
Lines changed: 1 addition & 1 deletion

@@ -24,8 +24,8 @@
 from pytorch_lightning.callbacks.base import Callback
 from pytorch_lightning.trainer.states import RunningStage
 from pytorch_lightning.utilities import LightningEnum
-from pytorch_lightning.utilities.distributed import rank_zero_info
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_info

 log = logging.getLogger(__name__)

pytorch_lightning/callbacks/xla_stats_monitor.py
Lines changed: 2 additions & 1 deletion

@@ -22,8 +22,9 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.base import Callback
-from pytorch_lightning.utilities import _AcceleratorType, _TPU_AVAILABLE, rank_zero_deprecation, rank_zero_info
+from pytorch_lightning.utilities import _AcceleratorType, _TPU_AVAILABLE
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_info

 if _TPU_AVAILABLE:
     import torch_xla.core.xla_model as xm

pytorch_lightning/core/decorators.py
Lines changed: 1 addition & 3 deletions

@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from pytorch_lightning.utilities import rank_zero_deprecation
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_warn

 rank_zero_deprecation(
     "Using `pytorch_lightning.core.decorators.parameter_validation` is deprecated in v1.5, "
@@ -22,8 +22,6 @@
 from functools import wraps  # noqa: E402
 from typing import Callable  # noqa: E402

-from pytorch_lightning.utilities import rank_zero_warn  # noqa: E402
-

 def parameter_validation(fn: Callable) -> Callable:
     """Validates that the module parameter lengths match after moving to the device. It is useful when tying

pytorch_lightning/core/lightning.py
Lines changed: 3 additions & 8 deletions

@@ -38,20 +38,15 @@
 from pytorch_lightning.core.optimizer import LightningOptimizer
 from pytorch_lightning.core.saving import ModelIO
 from pytorch_lightning.trainer.connectors.logger_connector.fx_validator import _FxValidator
-from pytorch_lightning.utilities import (
-    _IS_WINDOWS,
-    _TORCH_GREATER_EQUAL_1_10,
-    GradClipAlgorithmType,
-    rank_zero_deprecation,
-    rank_zero_warn,
-)
+from pytorch_lightning.utilities import _IS_WINDOWS, _TORCH_GREATER_EQUAL_1_10, GradClipAlgorithmType
 from pytorch_lightning.utilities.apply_func import apply_to_collection, convert_to_tensors
 from pytorch_lightning.utilities.cloud_io import get_filesystem
-from pytorch_lightning.utilities.distributed import distributed_available, rank_zero_debug, sync_ddp
+from pytorch_lightning.utilities.distributed import distributed_available, sync_ddp
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.memory import get_model_size_mb
 from pytorch_lightning.utilities.model_summary import ModelSummary, summarize
 from pytorch_lightning.utilities.parsing import collect_init_args
+from pytorch_lightning.utilities.rank_zero import rank_zero_debug, rank_zero_deprecation, rank_zero_warn
 from pytorch_lightning.utilities.signature_utils import is_param_in_hook_signature
 from pytorch_lightning.utilities.types import _METRIC_COLLECTION, EPOCH_OUTPUT, LRSchedulerTypeUnion, STEP_OUTPUT
 from pytorch_lightning.utilities.warnings import WarningCache

pytorch_lightning/core/optimizer.py
Lines changed: 1 addition & 1 deletion

@@ -21,9 +21,9 @@
 from torch.optim import Optimizer

 import pytorch_lightning as pl
-from pytorch_lightning.utilities import rank_zero_warn
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.model_helpers import is_overridden
+from pytorch_lightning.utilities.rank_zero import rank_zero_warn
 from pytorch_lightning.utilities.types import _Stateful, LRSchedulerConfig, LRSchedulerTypeTuple, ReduceLROnPlateau

pytorch_lightning/core/saving.py
Lines changed: 2 additions & 1 deletion

@@ -26,12 +26,13 @@
 import torch
 import yaml

-from pytorch_lightning.utilities import _OMEGACONF_AVAILABLE, AttributeDict, rank_zero_warn
+from pytorch_lightning.utilities import _OMEGACONF_AVAILABLE, AttributeDict
 from pytorch_lightning.utilities.apply_func import apply_to_collection
 from pytorch_lightning.utilities.cloud_io import get_filesystem
 from pytorch_lightning.utilities.cloud_io import load as pl_load
 from pytorch_lightning.utilities.migration import pl_legacy_patch
 from pytorch_lightning.utilities.parsing import parse_class_init_keys
+from pytorch_lightning.utilities.rank_zero import rank_zero_warn

 log = logging.getLogger(__name__)
 PRIMITIVE_TYPES = (bool, int, float, str)

pytorch_lightning/loggers/base.py
Lines changed: 1 addition & 2 deletions

@@ -26,8 +26,7 @@

 import pytorch_lightning as pl
 from pytorch_lightning.callbacks.model_checkpoint import ModelCheckpoint
-from pytorch_lightning.utilities import rank_zero_only
-from pytorch_lightning.utilities.warnings import rank_zero_deprecation
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_only


 def rank_zero_experiment(fn: Callable) -> Callable:

pytorch_lightning/loggers/comet.py
Lines changed: 2 additions & 1 deletion

@@ -26,9 +26,10 @@

 import pytorch_lightning as pl
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import _module_available, rank_zero_only
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
+from pytorch_lightning.utilities.imports import _module_available
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _flatten_dict
+from pytorch_lightning.utilities.rank_zero import rank_zero_only

 log = logging.getLogger(__name__)
 _COMET_AVAILABLE = _module_available("comet_ml")

pytorch_lightning/loggers/csv_logs.py
Lines changed: 1 addition & 2 deletions

@@ -28,9 +28,8 @@

 from pytorch_lightning.core.saving import save_hparams_to_yaml
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import rank_zero_warn
-from pytorch_lightning.utilities.distributed import rank_zero_only
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params
+from pytorch_lightning.utilities.rank_zero import rank_zero_only, rank_zero_warn

 log = logging.getLogger(__name__)

pytorch_lightning/loggers/mlflow.py
Lines changed: 2 additions & 1 deletion

@@ -23,8 +23,9 @@
 from typing import Any, Dict, Optional, Union

 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import _module_available, rank_zero_only, rank_zero_warn
+from pytorch_lightning.utilities.imports import _module_available
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _flatten_dict
+from pytorch_lightning.utilities.rank_zero import rank_zero_only, rank_zero_warn

 log = logging.getLogger(__name__)
 LOCAL_FILE_URI_PREFIX = "file:"

pytorch_lightning/loggers/neptune.py
Lines changed: 1 addition & 1 deletion

@@ -32,10 +32,10 @@
 from pytorch_lightning import __version__
 from pytorch_lightning.callbacks.model_checkpoint import ModelCheckpoint
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import rank_zero_only
 from pytorch_lightning.utilities.imports import _NEPTUNE_AVAILABLE, _NEPTUNE_GREATER_EQUAL_0_9
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _sanitize_callable_params
 from pytorch_lightning.utilities.model_summary import ModelSummary
+from pytorch_lightning.utilities.rank_zero import rank_zero_only

 if _NEPTUNE_AVAILABLE and _NEPTUNE_GREATER_EQUAL_0_9:
     try:

pytorch_lightning/loggers/tensorboard.py
Lines changed: 2 additions & 1 deletion

@@ -29,10 +29,11 @@
 import pytorch_lightning as pl
 from pytorch_lightning.core.saving import save_hparams_to_yaml
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import _OMEGACONF_AVAILABLE, rank_zero_only, rank_zero_warn
 from pytorch_lightning.utilities.cloud_io import get_filesystem
+from pytorch_lightning.utilities.imports import _OMEGACONF_AVAILABLE
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _flatten_dict
 from pytorch_lightning.utilities.logger import _sanitize_params as _utils_sanitize_params
+from pytorch_lightning.utilities.rank_zero import rank_zero_only, rank_zero_warn

 log = logging.getLogger(__name__)

pytorch_lightning/loggers/test_tube.py
Lines changed: 2 additions & 2 deletions

@@ -20,9 +20,9 @@

 import pytorch_lightning as pl
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import _module_available, rank_zero_deprecation, rank_zero_warn
-from pytorch_lightning.utilities.distributed import rank_zero_only
+from pytorch_lightning.utilities import _module_available
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _flatten_dict
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_only, rank_zero_warn

 _TESTTUBE_AVAILABLE = _module_available("test_tube")

pytorch_lightning/loggers/wandb.py
Lines changed: 2 additions & 3 deletions

@@ -26,11 +26,10 @@

 from pytorch_lightning.callbacks.model_checkpoint import ModelCheckpoint
 from pytorch_lightning.loggers.base import LightningLoggerBase, rank_zero_experiment
-from pytorch_lightning.utilities import _module_available, rank_zero_only
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
-from pytorch_lightning.utilities.imports import _compare_version
+from pytorch_lightning.utilities.imports import _compare_version, _module_available
 from pytorch_lightning.utilities.logger import _add_prefix, _convert_params, _flatten_dict, _sanitize_callable_params
-from pytorch_lightning.utilities.warnings import rank_zero_warn
+from pytorch_lightning.utilities.rank_zero import rank_zero_only, rank_zero_warn

 _WANDB_AVAILABLE = _module_available("wandb")
 _WANDB_GREATER_EQUAL_0_10_22 = _compare_version("wandb", operator.ge, "0.10.22")

pytorch_lightning/loops/epoch/training_epoch_loop.py
Lines changed: 2 additions & 2 deletions

@@ -23,14 +23,14 @@
 from pytorch_lightning.loops.utilities import _get_active_optimizers, _is_max_limit_reached
 from pytorch_lightning.trainer.connectors.logger_connector.result import _ResultCollection
 from pytorch_lightning.trainer.progress import BatchProgress, SchedulerProgress
-from pytorch_lightning.utilities import rank_zero_warn
 from pytorch_lightning.utilities.apply_func import apply_to_collection
 from pytorch_lightning.utilities.auto_restart import _collect_states_on_rank_zero_over_collection
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.fetching import AbstractDataFetcher, DataLoaderIterDataFetcher
 from pytorch_lightning.utilities.model_helpers import is_overridden
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_warn
 from pytorch_lightning.utilities.signature_utils import is_param_in_hook_signature
-from pytorch_lightning.utilities.warnings import rank_zero_deprecation, WarningCache
+from pytorch_lightning.utilities.warnings import WarningCache

 _OUTPUTS_TYPE = List[_BATCH_OUTPUTS_TYPE]

pytorch_lightning/loops/fit_loop.py
Lines changed: 1 addition & 2 deletions

@@ -22,11 +22,10 @@
 from pytorch_lightning.trainer.connectors.logger_connector.result import _ResultCollection
 from pytorch_lightning.trainer.progress import Progress
 from pytorch_lightning.trainer.supporters import TensorRunningAccum
-from pytorch_lightning.utilities import rank_zero_deprecation
 from pytorch_lightning.utilities.enums import _FaultTolerantMode
 from pytorch_lightning.utilities.exceptions import MisconfigurationException
 from pytorch_lightning.utilities.model_helpers import is_overridden
-from pytorch_lightning.utilities.warnings import rank_zero_warn
+from pytorch_lightning.utilities.rank_zero import rank_zero_deprecation, rank_zero_warn

 log = logging.getLogger(__name__)
