Trainer sets incorrect precision for complex dtype #8178

Closed
Damowerko opened this issue Jun 28, 2021 · 0 comments · Fixed by #8208

Damowerko commented Jun 28, 2021

🐛 Bug

Please reproduce using the BoringModel

To Reproduce

I made the following changes to the BoringModel to reproduce:

  • Registered a complex buffer in __init__ (its name is complex_buffer).
  • Created a trainer with precision=64.
  • Printed out the dtype of the complex buffer.
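The steps above can be sketched with plain torch (no Lightning), assuming the underlying mechanism is a blanket `Module.to(torch.float64)` cast, which converts complex tensors as well as floating-point ones:

```python
import torch
from torch import nn


class BoringModule(nn.Module):
    """Minimal stand-in for the BoringModel, with a complex buffer."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)
        # Single-precision complex buffer (complex64 = two float32 components).
        self.register_buffer("complex_buffer", torch.zeros(4, dtype=torch.complex64))


module = BoringModule()
print(module.complex_buffer.dtype)  # torch.complex64

# Module.to(dtype) converts floating-point *and* complex tensors, so the
# complex buffer is cast to a real float64 tensor, dropping the imaginary
# part (PyTorch emits a "Casting complex values to real ..." warning).
module.to(torch.float64)
print(module.complex_buffer.dtype)  # torch.float64
```

This is only a guess at the mechanism; the actual cast happens inside the Lightning precision plugin, not in user code.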

Expected behavior

The complex buffer should be either cast to complex128 or left alone.
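For context, complex128 is the double-precision complex dtype: its real and imaginary components are each a float64, which is why it is the natural target for precision=64:

```python
import torch

# Widening complex64 -> complex128 preserves both components losslessly,
# unlike a cast to float64, which discards the imaginary part.
z = torch.zeros(3, dtype=torch.complex64).to(torch.complex128)
print(z.real.dtype)  # torch.float64
print(z.imag.dtype)  # torch.float64
```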

Environment

  • CUDA:
    • GPU:
      • Tesla T4
    • available: True
    • version: 10.2
  • Packages:
    • numpy: 1.19.5
    • pyTorch_debug: False
    • pyTorch_version: 1.9.0+cu102
    • pytorch-lightning: 1.3.7post0
    • tqdm: 4.41.1
  • System:
    • OS: Linux
    • architecture:
      • 64bit
    • processor: x86_64
    • python: 3.7.10
    • version: #1 SMP Sat Jun 5 09:50:34 PDT 2021

Additional context

I think the casting occurs in DoublePrecisionPlugin.connect.
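One way the plugin could avoid this (a sketch only, not the actual fix merged in #8208; `cast_to_double` is a hypothetical helper) is to map each tensor to its matching double-precision dtype rather than casting everything to float64:

```python
import torch
from torch import nn


def cast_to_double(module: nn.Module) -> nn.Module:
    """Cast floating-point tensors to float64 and complex tensors to
    complex128, leaving integer/bool tensors untouched (sketch only)."""

    def convert(t: torch.Tensor) -> torch.Tensor:
        if t.is_complex():
            return t.to(torch.complex128)
        if t.is_floating_point():
            return t.to(torch.float64)
        return t

    # _apply is the private hook nn.Module.double() uses internally; it
    # visits every parameter and buffer in the module tree.
    return module._apply(convert)
```

For comparison, nn.Module.double() only casts floating-point tensors and leaves complex ones untouched, which would satisfy the "left alone" half of the expected behavior.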

@Damowerko Damowerko added bug Something isn't working help wanted Open to be worked on labels Jun 28, 2021
@carmocca carmocca added this to the v1.3.x milestone Jun 29, 2021
@carmocca carmocca added the priority: 2 Low priority task label Jun 29, 2021