I am designing a WGAN with pytorch-lightning, but I am having trouble clipping the generator's gradients while leaving the discriminator's gradients unclipped. #9307
Unanswered
forechoandlook asked this question in code help: CV
Replies: 1 comment
-
For now, the internal gradient clipping procedure clips gradients for both the generator and the discriminator. You can work around this by disabling Lightning's clipping and doing it manually in a hook:

```python
def on_before_optimizer_step(self, trainer, pl_module, optimizer, opt_idx):
    if opt_idx == 0:  # assuming that opt_idx=0 is for the generator
        params = ...
        torch.nn.utils.clip_grad_norm_(params, clip_val)
```

We will add support for it in the future :)
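A slightly fuller sketch of that suggestion as a standalone Callback. The class name, the clip value, the `pl_module.generator` attribute, and the choice of `opt_idx == 0` for the generator are all illustrative assumptions, not from the thread:

```python
import torch
import pytorch_lightning as pl


class GeneratorOnlyGradClip(pl.Callback):
    """Clip gradients for the generator's optimizer only (illustrative helper)."""

    def __init__(self, clip_val: float = 1.0):
        self.clip_val = clip_val

    def on_before_optimizer_step(self, trainer, pl_module, optimizer, opt_idx):
        # opt_idx follows the order returned by configure_optimizers;
        # here we assume index 0 is the generator's optimizer.
        if opt_idx == 0:
            torch.nn.utils.clip_grad_norm_(
                pl_module.generator.parameters(), self.clip_val
            )


# Leave gradient_clip_val unset on the Trainer so Lightning's built-in
# clipping stays disabled for both models:
# trainer = pl.Trainer(callbacks=[GeneratorOnlyGradClip(clip_val=1.0)])
```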
-
I do not want to use manual optimization in my LightningModule.
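Note that the callback approach above runs alongside automatic optimization, so manual optimization is not required. As a hedged aside: later Lightning releases (around 1.5, if I recall the changelog correctly) added a `configure_gradient_clipping` hook on the LightningModule that expresses the same per-optimizer behavior directly, roughly like this (the 0.5 value and the index-0-is-generator assumption are illustrative):

```python
def configure_gradient_clipping(self, optimizer, optimizer_idx,
                                gradient_clip_val=None,
                                gradient_clip_algorithm=None):
    # Clip only the generator's gradients; skip the discriminator entirely.
    if optimizer_idx == 0:  # assuming index 0 is the generator
        self.clip_gradients(optimizer, gradient_clip_val=0.5,
                            gradient_clip_algorithm="norm")
```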