
AttributeError: 'LightningDistributedModule' object has no attribute 'require_backward_grad_sync' #10080


Closed
sunshuofeng opened this issue Oct 22, 2021 · 4 comments
Labels
bug (Something isn't working) · help wanted (Open to be worked on) · won't fix (This will not be worked on)

Comments

@sunshuofeng

🐛 Bug

When I use manual_backward, this bug appears.
I know a newer version may fix this, but for certain reasons I can only use version 1.2.4. What should I do?
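
Roughly, my setup looks like the following. This is only a minimal sketch with a placeholder model and data, not my actual code, but it shows where manual_backward is called:

import torch
import pytorch_lightning as pl


class Model(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(32, 4)

    @property
    def automatic_optimization(self):
        # enable manual optimization (1.2-style property override)
        return False

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        opt.zero_grad()
        loss = self.net(batch).sum()
        # this is the call that raises the AttributeError when running with ddp
        self.manual_backward(loss)
        opt.step()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


train_loader = torch.utils.data.DataLoader(torch.randn(64, 32), batch_size=8)
trainer = pl.Trainer(gpus=2, accelerator="ddp", max_epochs=1)
trainer.fit(Model(), train_loader)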

Environment

  • CUDA:
    - GPU:
    - available: True
    - version: 11.1
  • Packages:
    - numpy: 1.21.2
    - pyTorch_debug: False
    - pyTorch_version: 1.8.1+cu111
    - pytorch-lightning: 1.2.4
    - tqdm: 4.62.3
  • System:
    - OS: Linux
    - architecture: 64bit
    - processor: x86_64
    - python: 3.7.11
    - version: #4 SMP Mon Mar 30 12:42:28 HKT 2020

Thanks for your help!!

sunshuofeng added the bug and help wanted labels Oct 22, 2021
@kaushikb11
Contributor

Hey @sunshuofeng, could you update PyTorch Lightning to the latest?

@sunshuofeng
Author

sunshuofeng commented Oct 22, 2021

I wish I could, but due to some version conflict issues I can only use version 1.2.

What I want to do is optimize a center loss, so there are two optimizers. I've tried a number of other solutions.

Optimizing the center loss requires multiplying its gradients by a coefficient, and an approach that runs without errors is:

def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                   optimizer_closure, *args, **kwargs):
    # optimize the model
    if optimizer_idx == 0:
        optimizer.step(closure=optimizer_closure)

    # optimize the center loss
    else:
        optimizer_closure()  # runs training_step + backward a second time
        for param in center_loss.parameters():
            param.grad.data *= weight
        optimizer.step()

But this runs the forward pass twice, which is obviously unnecessary, so I tried the following:

def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                   optimizer_closure, *args, **kwargs):
    # do everything on the first optimizer's call only
    if optimizer_idx == 0:
        optimizer, optimizer_center = self.optimizers()
        optimizer_closure()  # single forward/backward
        for param in center_loss.parameters():
            param.grad.data *= weight
        optimizer.step()
        optimizer_center.step()

But this fails with an error saying that param.grad is None.

So I guess I'll just have to use manual optimization.
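
For reference, this is roughly how I would write it with manual optimization. It is only a sketch: the CenterLoss module, the layer sizes, and the weight coefficient are simplified stand-ins for my real code:

import torch
import pytorch_lightning as pl


class CenterLoss(torch.nn.Module):
    # simplified stand-in: mean squared distance to a learnable class center
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        self.centers = torch.nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, feats, labels):
        return ((feats - self.centers[labels]) ** 2).sum(dim=1).mean()


class Model(pl.LightningModule):
    def __init__(self, num_classes=10, feat_dim=64, weight=0.5):
        super().__init__()
        self.backbone = torch.nn.Linear(32, feat_dim)      # placeholder backbone
        self.classifier = torch.nn.Linear(feat_dim, num_classes)
        self.center_loss = CenterLoss(num_classes, feat_dim)
        self.weight = weight

    @property
    def automatic_optimization(self):
        return False

    def training_step(self, batch, batch_idx):
        opt, opt_center = self.optimizers()
        opt.zero_grad()
        opt_center.zero_grad()

        x, y = batch
        feats = self.backbone(x)
        logits = self.classifier(feats)
        loss = torch.nn.functional.cross_entropy(logits, y) + self.center_loss(feats, y)

        # one forward, one backward for both losses
        self.manual_backward(loss)

        # scale the center-loss gradients by the coefficient before stepping
        for param in self.center_loss.parameters():
            param.grad.data *= self.weight

        opt.step()
        opt_center.step()

    def configure_optimizers(self):
        opt = torch.optim.SGD(
            list(self.backbone.parameters()) + list(self.classifier.parameters()), lr=0.01)
        opt_center = torch.optim.SGD(self.center_loss.parameters(), lr=0.5)
        return opt, opt_center

This only needs a single forward pass per step, but the self.manual_backward call is exactly what hits the AttributeError on 1.2.4 with ddp.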

@awaelchli
Contributor

Perhaps you could share more about what the version conflict issues are and we could try to help with that?

@stale

stale bot commented Nov 22, 2021

This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, Pytorch Lightning Team!

stale bot added the won't fix label Nov 22, 2021
stale bot closed this as completed Dec 10, 2021