trainer.validated/tested/predicted_ckpt_path: mark private or deprecate
#10630
Labels
- checkpointing: Related to checkpointing
- deprecation: Includes a deprecation
- let's do it!: approved to implement
- refactor
Proposed refactor / Motivation
As detailed in this comment: #10573 (comment):
These properties are intended to be read-only; they do not support any actual functionality (they do not and should not set the load path).
The main reason to keep them is to track the exact ckpt_path used in the case of automatic "best" path resolution (https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/trainer.py#L1363).
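As a hedged illustration (not Lightning's actual implementation; the class names, attribute names, and checkpoint filename below are invented for the sketch), the tracking behavior described above amounts to recording the concrete path after "best" resolution:

```python
from typing import Optional


class ModelCheckpoint:
    """Stand-in for the checkpoint callback that knows the best model path."""

    def __init__(self, best_model_path: str) -> None:
        self.best_model_path = best_model_path


class Trainer:
    """Minimal sketch: validated_ckpt_path only tracks the resolved path."""

    def __init__(self, checkpoint_callback: ModelCheckpoint) -> None:
        self.checkpoint_callback = checkpoint_callback
        # read-only tracking attribute; setting it has no effect on loading
        self.validated_ckpt_path: Optional[str] = None

    def validate(self, ckpt_path: Optional[str] = "best") -> None:
        if ckpt_path == "best":
            # resolve "best" to the concrete path produced by checkpointing
            ckpt_path = self.checkpoint_callback.best_model_path
        # record the exact path that was used, for later inspection
        self.validated_ckpt_path = ckpt_path


trainer = Trainer(ModelCheckpoint("epoch=3-step=400.ckpt"))
trainer.validate(ckpt_path="best")
print(trainer.validated_ckpt_path)  # -> epoch=3-step=400.ckpt
```

The point of the sketch: the attribute carries no behavior of its own, which is why making it private (or dropping it) loses nothing functionally.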
Pitch
To clarify/emphasize this, we can do one of the following:
1. Make them private. We also only need a single attribute (e.g. `trainer._ckpt_path` or `trainer._last_restored_ckpt_path`).
2. Deprecate these attributes entirely. The checkpoint_connector already gives us info on the exact path here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/checkpoint_connector.py#L75
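A possible deprecation path for option 1 could look roughly like the sketch below (the private attribute name follows the issue's suggestion, but the property body and warning message are illustrative, not Lightning's actual code): keep one private attribute and expose the old name as a deprecated read-only property.

```python
import warnings
from typing import Optional


class Trainer:
    """Sketch: one private tracking attribute, deprecated public accessor."""

    def __init__(self) -> None:
        # single source of truth for the last restored checkpoint path
        self._ckpt_path: Optional[str] = None

    @property
    def validated_ckpt_path(self) -> Optional[str]:
        # emit a DeprecationWarning pointing users at the replacement
        warnings.warn(
            "`trainer.validated_ckpt_path` is deprecated and will be removed;"
            " the exact checkpoint path used is tracked internally.",
            DeprecationWarning,
            stacklevel=2,
        )
        return self._ckpt_path
```

With this pattern, existing user code keeps working during the deprecation window while the warning steers it away from the attribute.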
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @rohitgr7 @ananthsub @ninginthecloud @tchaton