Improve collision check on hparams between LightningModule and DataModule #9492
Labels: feature (Is an improvement or enhancement) · good first issue (Good for newcomers) · help wanted (Open to be worked on) · let's do it! (approved to implement)
🚀 Feature
Motivation
With the recent ability to log hyperparameters from the DataModule, an exception was introduced for the case where hparam keys overlap between the LightningModule and the DataModule:
https://github.com/PyTorchLightning/pytorch-lightning/blob/ec828b826717cd3b5beabcb6d0cacf41b2320a98/pytorch_lightning/trainer/trainer.py#L1043-L1053
However, this check can be overly strict: if the same hparams (same key and same value) are intentionally shared across the LightningModule and DataModule, the check still raises an error.
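A rough sketch of the kind of check the linked Trainer code performs (names and message are approximate, not the exact internals): any shared key is treated as a collision, regardless of whether the values match.

```python
from pytorch_lightning.utilities.exceptions import MisconfigurationException


def _merge_hparams_strict(lightning_hparams: dict, datamodule_hparams: dict) -> dict:
    # Current behavior (approximate): any key present in both dicts is a
    # collision, even when both sides carry the identical value.
    colliding_keys = lightning_hparams.keys() & datamodule_hparams.keys()
    if colliding_keys:
        raise MisconfigurationException(
            f"Error while merging hparams: the keys {colliding_keys} are present "
            "in both the LightningModule's and LightningDataModule's hparams."
        )
    return {**lightning_hparams, **datamodule_hparams}
```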
Pitch
Only raise an exception when there are overlapping keys that have different values across the LightningModule and DataModule; see the sketch below.
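A minimal sketch of the proposed relaxed check, assuming plain dict-like `hparams_initial` on both modules (illustrative only, not a definitive implementation):

```python
from pytorch_lightning.utilities.exceptions import MisconfigurationException


def _merge_hparams_relaxed(lightning_hparams: dict, datamodule_hparams: dict) -> dict:
    # Proposed behavior: only flag keys whose values actually differ between
    # the LightningModule and the DataModule; identical values merge silently.
    inconsistent_keys = [
        key
        for key in lightning_hparams.keys() & datamodule_hparams.keys()
        if lightning_hparams[key] != datamodule_hparams[key]
    ]
    if inconsistent_keys:
        raise MisconfigurationException(
            f"Error while merging hparams: the keys {inconsistent_keys} are present "
            "in both the LightningModule's and LightningDataModule's hparams "
            "but have different values."
        )
    return {**lightning_hparams, **datamodule_hparams}
```

For example, a shared `batch_size=32` logged by both modules would merge cleanly, while `batch_size=32` vs. `batch_size=64` would still raise.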
Alternatives
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.