
Commit 66ecaae

Remove metrics references from docs

1 parent 2c7c4aa

File tree

5 files changed: +2 additions, -13 deletions

docs/source/advanced/multi_gpu.rst

Lines changed: 1 addition & 1 deletion

@@ -90,7 +90,7 @@ This is done by adding ``sync_dist=True`` to all ``self.log`` calls in the valid
 This ensures that each GPU worker has the same behaviour when tracking model checkpoints, which is important for later downstream tasks such as testing the best checkpoint across all workers.
 The ``sync_dist`` option can also be used in logging calls during the step methods, but be aware that this can lead to significant communication overhead and slow down your training.
 
-Note if you use any built in metrics or custom metrics that use the :doc:`Metrics API <../extensions/metrics>`, these do not need to be updated and are automatically handled for you.
+Note if you use any built in metrics or custom metrics that use `TorchMetrics <https://torchmetrics.readthedocs.io/>`_, these do not need to be updated and are automatically handled for you.
 
 
 .. testcode::
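The changed note says metrics built on TorchMetrics synchronize across workers automatically, while plain ``self.log`` values need ``sync_dist=True``. A minimal, dependency-free sketch of the mean reduction that ``sync_dist=True`` performs (``sync_dist_mean`` is a hypothetical helper for illustration, not Lightning's actual implementation):

```python
def sync_dist_mean(per_worker_values):
    """Average a logged scalar across workers, as an all-reduce(mean) would."""
    return sum(per_worker_values) / len(per_worker_values)

# Four GPU workers each computed a slightly different validation loss;
# after the reduction, every worker logs the same number, so checkpoint
# tracking behaves identically on all of them.
val_losses = [0.52, 0.48, 0.50, 0.54]
synced = sync_dist_mean(val_losses)
print(round(synced, 2))  # 0.51 on every worker
```

In a real multi-GPU run this is an all-reduce over the process group; the sketch only shows why each worker ends up comparing checkpoints against the same logged value.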

docs/source/extensions/logging.rst

Lines changed: 1 addition & 1 deletion

@@ -111,7 +111,7 @@ The :func:`~pytorch_lightning.core.lightning.LightningModule.log` method has a
 .. note::
 
     - Setting ``on_epoch=True`` will cache all your logged values during the full training epoch and perform a
-      reduction in ``on_train_epoch_end``. We recommend using the :doc:`metrics <../extensions/metrics>` API when working with custom reduction.
+      reduction in ``on_train_epoch_end``. We recommend using `TorchMetrics <https://torchmetrics.readthedocs.io/>`_ when working with custom reduction.
 
     - Setting both ``on_step=True`` and ``on_epoch=True`` will create two keys per metric you log with
       suffix ``_step`` and ``_epoch``, respectively. You can refer to these keys e.g. in the `monitor`
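The cache-then-reduce behaviour this note describes can be sketched without Lightning installed. ``EpochLogger`` below is a hypothetical stand-in for what ``self.log(..., on_step=True, on_epoch=True)`` does: it records a ``_step`` key per call, caches the values, and reduces them into an ``_epoch`` key at epoch end (assumption: mean as the default reduction, matching Lightning's default):

```python
class EpochLogger:
    """Toy model of Lightning's on_step/on_epoch logging (not the real API)."""

    def __init__(self):
        self.cache = []   # values cached while on_epoch=True
        self.logged = {}  # what would reach the logger backend

    def log_step(self, name, value):
        self.logged[f"{name}_step"] = value  # on_step=True key, per call
        self.cache.append(value)             # cached for the epoch reduction

    def epoch_end(self, name):
        # reduction performed in on_train_epoch_end; mean is the default
        self.logged[f"{name}_epoch"] = sum(self.cache) / len(self.cache)
        self.cache.clear()

logger = EpochLogger()
for loss in [0.9, 0.7, 0.5]:
    logger.log_step("train_loss", loss)
logger.epoch_end("train_loss")

print(sorted(logger.logged))  # ['train_loss_epoch', 'train_loss_step']
```

The two keys per metric are exactly what the note means: ``train_loss_step`` holds the latest per-step value, while ``train_loss_epoch`` holds the epoch-level reduction, and either key can be used as a ``monitor`` target.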

docs/source/extensions/metrics.rst

Lines changed: 0 additions & 9 deletions
This file was deleted.

docs/source/index.rst

Lines changed: 0 additions & 1 deletion
@@ -84,7 +84,6 @@ PyTorch Lightning
    extensions/callbacks
    extensions/datamodules
    extensions/logging
-   extensions/metrics
    extensions/plugins
    extensions/loops
 

pyproject.toml

Lines changed: 0 additions & 1 deletion
@@ -44,7 +44,6 @@ module = [
     "pytorch_lightning.core.*",
     "pytorch_lightning.loggers.*",
     "pytorch_lightning.loops.*",
-    "pytorch_lightning.metrics.*",
     "pytorch_lightning.overrides.*",
     "pytorch_lightning.plugins.environments.*",
     "pytorch_lightning.plugins.training_type.*",
