
loop customization docs #9609


Merged (91 commits into master, Oct 18, 2021)

Conversation

@awaelchli (Contributor) commented Sep 20, 2021

What does this PR do?

Continuation of #9557 with new structure.

TODO:

  • move the final GIFs to the S3 bucket and include via link

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines. In short, check the following:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

I made sure I had fun coding 🙃

@awaelchli force-pushed the docs/loop-customization-2 branch from ca89b80 to dda835a on September 20, 2021 14:07
@lantiga (Collaborator) left a comment


Big improvement compared to the previous iteration, we can still improve, up to us when to do it. I have left comments inline.

A comment on the graphics: I really like the GIFs, but I noticed that except for the first one, the last frame persists on screen for only a fraction of a second, while it should stay for a few full seconds. It's really hard to take in right now.

# train generator, then yield
fake_pred = self.discriminator(fake)
gen_loss = self.criterion(fake_pred, fake_gt)
yield gen_loss
Collaborator

This example is not easy to follow, I had to go back and forth a few times to grasp it. We should probably make an effort at making it more digestible by avoiding glossing over details, and explaining (in words or visually) how the generator will be advanced once per optimizer.

Contributor Author

Thanks Luca, I agree it's quite complex. The use case is simple but the implementation at the moment is not so easy. As we will continue to improve access to loop customization, this should become much cleaner. For now, I'm taking the example out of the docs and will create a tutorial example instead here: #9983

Plus, we will add more API docs to the loops, especially for the undocumented methods, so they become accessible through our doc pages.
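As a framework-free sketch of the pattern under discussion (a training step written as a generator, advanced once per optimizer), the following toy code illustrates the control flow. All names and values here are illustrative stand-ins, not the actual Lightning API or the PR's example:

```python
# Sketch of the idea discussed above: training_step is a generator that
# yields one loss per optimizer, and the loop advances the generator
# exactly once per optimizer. Toy values stand in for real losses.

def training_step(batch):
    # First yielded "loss" corresponds to the discriminator optimizer,
    # the second to the generator optimizer, mirroring the GAN setup.
    disc_loss = batch * 0.5   # stand-in for a real discriminator loss
    yield disc_loss
    gen_loss = batch * 2.0    # stand-in for a real generator loss
    yield gen_loss

def optimizer_loop(batch, optimizers):
    """Advance the generator once per optimizer, pairing each yielded
    loss with the optimizer that should consume it."""
    gen = training_step(batch)
    applied = []
    for opt, loss in zip(optimizers, gen):
        # a real loop would backprop `loss` and call `opt.step()` here
        applied.append((opt, loss))
    return applied

steps = optimizer_loop(3.0, ["disc_opt", "gen_opt"])
print(steps)  # [('disc_opt', 1.5), ('gen_opt', 6.0)]
```

The key point is that `zip` pulls exactly one yielded loss per optimizer, which is the "advanced once per optimizer" behavior the reviewer asked to see explained.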

@tchaton tchaton marked this pull request as ready for review October 15, 2021 15:52
codecov bot commented Oct 15, 2021

Codecov Report

Merging #9609 (7bfacdb) into master (23450e2) will decrease coverage by 0%.
The diff coverage is 100%.

@@           Coverage Diff           @@
##           master   #9609    +/-   ##
=======================================
- Coverage      93%     93%    -0%     
=======================================
  Files         180     179     -1     
  Lines       15090   15829   +739     
=======================================
+ Hits        14019   14670   +651     
- Misses       1071    1159    +88     

@edenlightning (Contributor) left a comment

Looks great!

Just a note: better to replace all references to classes and functions with cross-references (:class:`~pytorch_lightning.core.optimizer.LightningOptimizer`, etc.)

@mergify bot added the "ready" label (PRs ready to be merged) on Oct 15, 2021
@kaushikb11 (Contributor) left a comment

LGTM!

@awaelchli mentioned this pull request on Oct 18, 2021
@awaelchli enabled auto-merge (squash) October 18, 2021 09:10
@awaelchli merged commit 7a91516 into master Oct 18, 2021
@awaelchli deleted the docs/loop-customization-2 branch October 18, 2021 09:43
rohitgr7 pushed a commit to Tshimanga/pytorch-lightning that referenced this pull request Oct 18, 2021
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Carlos Mocholí <[email protected]>
Co-authored-by: thomas chaton <[email protected]>
Co-authored-by: edenlightning <[email protected]>
Comment on lines +32 to +36
val_loss = 0
for i, val_batch in enumerate(val_dataloader):
x, y = val_batch
y_hat = model(x)
val_loss += loss_function(y_hat, y)
Member

Looks a bit strange to run the full validation loop for each training step/batch.

Contributor Author

We want to enable strange things with loop customization :)) We wouldn't be able to do this so cleanly in pure Lightning. That's the core value haha

optimizer.step()

However, some new research use cases such as meta-learning, active learning, recommendation systems, etc., require a different loop structure.
For example here is a simple loop that guides the weight updates with a loss from a special validation split:
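A toy numeric version of the idea in the quoted docs text can make the loop structure concrete: each weight update is guided by a loss computed on a special validation split, inside the training loop. Everything below (the one-weight model, the data, the learning rate) is an illustrative stand-in, with no Lightning or torch involved:

```python
# Toy sketch: a single weight w fitting y = 2x, where each training
# update also folds in the gradient of a loss on a held-out
# validation split ("guided" weight updates).

def loss_fn(w, x, y):
    return (w * x - y) ** 2

def grad_fn(w, x, y):
    # d/dw of (w*x - y)^2
    return 2 * x * (w * x - y)

train = [(1.0, 2.0), (2.0, 4.0)]   # training split
val = [(3.0, 6.0)]                 # special validation split
w, lr = 0.0, 0.01

for _ in range(20):
    for x, y in train:
        # the validation gradient, computed inside the training loop,
        # guides every single weight update
        g = grad_fn(w, x, y) + sum(grad_fn(w, vx, vy) for vx, vy in val)
        w -= lr * g

print(round(w, 3))  # converges close to the true weight 2.0
```

The structural point is simply that the inner loop touches both splits per step, which is the part a default fit loop does not allow cleanly.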
Member

Can we refer to some literature? This kind of weight update is not very common...

@awaelchli (Contributor Author) commented Oct 19, 2021

the closest I could think of is bilevel learning: https://arxiv.org/pdf/1809.01465.pdf
What I sketched here in the docs is not exactly that but I couldn't add too many details because the focus should be on the illustration of the loop structure, less on the details of how such an approach would be useful in practice.

Each loop has a series of methods that can be modified.
For example with the :class:`~pytorch_lightning.loops.fit_loop.FitLoop`:

.. code-block::
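As a rough, framework-free sketch of the loop structure being documented, the class below mimics the shape of the Lightning Loop API at the time of this PR (`reset`, `advance`, a `done` property, and hook methods like `on_advance_end`); the skeleton and names are illustrative, not the real base class:

```python
# Plain-Python analogue of the Loop structure: run() resets the state,
# then repeatedly calls advance() and on_advance_end() until done.

class SimpleCountingLoop:
    def __init__(self, max_iters):
        self.max_iters = max_iters
        self.iteration = 0
        self.log = []

    @property
    def done(self):
        # stopping condition checked before every advance
        return self.iteration >= self.max_iters

    def reset(self):
        self.iteration = 0
        self.log = []

    def advance(self):
        # one unit of work per iteration
        self.log.append(f"advance {self.iteration}")

    def on_advance_end(self):
        # hook that runs after each advance; here it moves the counter
        self.iteration += 1

    def run(self):
        self.reset()
        while not self.done:
            self.advance()
            self.on_advance_end()
        return self.log

loop = SimpleCountingLoop(max_iters=3)
print(loop.run())  # ['advance 0', 'advance 1', 'advance 2']
```

Each of these methods is an override point, which is the customization story the docs page is explaining.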
Member

.. code-block:: python

Contributor Author

pre-commit/blacken-docs was not happy; I couldn't figure out the problem.

Member

but others seem to have it...

Contributor Author

yes, in other sections blacken-docs does not fail with a cryptic error :(
If you find the fix I will buy you a beer/coffee

Contributor

Am I eligible for the free beer?

It fails because you forgot the ":" in the on_advance_end def.

Also, the advance def should take self for consistency (but this does not make it fail).

🍺🍺🍺
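The two fixes described above (the missing ":" and `advance` taking `self`) would look like this in a minimal stand-in for the docs snippet; the class name echoes the docs but this is not Lightning code:

```python
# Minimal valid-Python version of the snippet, with both review fixes
# applied. A missing ":" after a def header is exactly the kind of
# syntax error that makes blacken-docs fail with a cryptic message.

class CustomFitLoop:
    def advance(self):         # fix: takes self, for consistency
        pass

    def on_advance_end(self):  # fix: the def header now ends with ":"
        pass

# instantiating and calling confirms the snippet parses and runs
loop = CustomFitLoop()
loop.advance()
loop.on_advance_end()
```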

Contributor Author

Ah, thank you very much!
#10036

Send me your home address on slack, I'll ship the beer asap 😃


.. _override default loops:

Overriding the default loops
Member

Just thinking: are all the loops somehow tested within the Trainer sanity check?

Contributor Author

nope, only the validation loop :)

Labels: docs (Documentation related), ready (PRs ready to be merged)

7 participants