Move ipu precision flag check to IPUPrecisionPlugin init #12148


Merged · 7 commits · Mar 5, 2022
13 changes: 12 additions & 1 deletion pytorch_lightning/plugins/precision/ipu.py
@@ -27,9 +27,20 @@


 class IPUPrecisionPlugin(PrecisionPlugin):
-    """Precision plugin for IPU integration."""
+    """Precision plugin for IPU integration.
+
+    Raises:
+        ValueError:
+            If the precision is neither 16 nor 32.
+    """

     def __init__(self, precision: int) -> None:
+        supported_precision_values = (16, 32)
+        if precision not in supported_precision_values:
+            raise ValueError(
+                f"`Trainer(accelerator='ipu', precision={precision!r})` is not supported."
+                f" `precision` must be one of: {supported_precision_values}."
Comment on lines +41 to +42

Contributor
So the reason we had the check in the accelerator connector is that the error message is not accurate when instantiating the plugin (or a subclass of it) outside the Trainer. So yeah, not ideal to move it here :)

Contributor Author
@awaelchli do you mean that users may init the precision plugin themselves? Are you suggesting we keep this in the accelerator connector?

Contributor
> do you mean that users may init the precision plugin themselves?

In general, yes: that's what the plugins are meant to be used for. For the IPU precision specifically, nobody will. Still, an error message that references the Trainer is misleading.

+            )
         super().__init__()
         self.precision = precision

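To make the reviewer's point above concrete, here is a minimal sketch (not part of this PR's diff) of instantiating the plugin directly, assuming the import path matches the file changed above and that construction needs no IPU hardware: the validation now runs in `__init__`, but the message it raises still talks about the Trainer.

# Sketch only: constructing the plugin outside any Trainer.
from pytorch_lightning.plugins.precision.ipu import IPUPrecisionPlugin

plugin = IPUPrecisionPlugin(precision=16)  # 16 and 32 pass the new check

try:
    IPUPrecisionPlugin(precision=64)  # fails the check added in this PR
except ValueError as err:
    # Prints "`Trainer(accelerator='ipu', precision=64)` is not supported. ..."
    # even though no Trainer was involved, which is the reviewer's concern.
    print(err)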
@@ -685,12 +685,6 @@ def _check_and_init_precision(self) -> PrecisionPlugin:

     def _validate_precision_choice(self) -> None:
         """Validate the combination of choices for precision, AMP type, and accelerator."""
-        # TODO: change exception type to ImpactableConfigurationException
-        if isinstance(self.accelerator, IPUAccelerator):
-            if self._precision_flag not in (16, 32):
-                raise MisconfigurationException(
-                    f"`Trainer(accelerator='ipu', precision={self._precision_flag!r})` is not supported."
-                )
         if isinstance(self.accelerator, TPUAccelerator):
             if self._precision_flag == 64:
                 raise MisconfigurationException(
4 changes: 2 additions & 2 deletions tests/accelerators/test_accelerator_connector.py
@@ -929,9 +929,9 @@ def test_unsupported_ipu_choice(mock_ipu_acc_avail, monkeypatch):

     monkeypatch.setattr(imports, "_IPU_AVAILABLE", True)
     monkeypatch.setattr(ipu, "_IPU_AVAILABLE", True)
-    with pytest.raises(MisconfigurationException, match=r"accelerator='ipu', precision='bf16'\)` is not supported"):
+    with pytest.raises(ValueError, match=r"accelerator='ipu', precision='bf16'\)` is not supported"):
         Trainer(accelerator="ipu", precision="bf16")
-    with pytest.raises(MisconfigurationException, match=r"accelerator='ipu', precision=64\)` is not supported"):
+    with pytest.raises(ValueError, match=r"accelerator='ipu', precision=64\)` is not supported"):
         Trainer(accelerator="ipu", precision=64)

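One note on the updated assertions: the plugin formats the value with `{precision!r}`, so a string renders with quotes while an int does not, which is why the first pattern escapes `'bf16'` and the second matches a bare `64`. A quick plain-Python sketch of that formatting (message text taken from the diff above):

# repr-style formatting, as used in the plugin's error message.
for value in ("bf16", 64):
    print(f"`Trainer(accelerator='ipu', precision={value!r})` is not supported.")
# `Trainer(accelerator='ipu', precision='bf16')` is not supported.
# `Trainer(accelerator='ipu', precision=64)` is not supported.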
