Provide intra-fit() ckpt_path access to enable additional finetuning modalities #10198


Conversation

@speediedan (Contributor) commented Oct 27, 2021

What does this PR do?

This PR includes a few minor changes that enable callbacks to restore checkpoints iteratively within the context of Trainer.fit().

An example use case that depends on access to ckpt_path (fit_ckpt_path) within fit() is available in a notebook-based tutorial (lightning-tutorials PR).
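By way of illustration, here is a minimal sketch of the kind of callback this enables. Only the `fit_ckpt_path` Trainer attribute comes from this PR; the callback name, the phase-transition condition, and the restore logic are hypothetical:

```python
import torch
from pytorch_lightning import Callback


class IterativeRestoreCallback(Callback):
    """Hypothetical callback that re-restores checkpoint state between finetuning phases."""

    def on_train_epoch_start(self, trainer, pl_module):
        # `trainer.fit_ckpt_path` is the attribute exposed by this PR;
        # the phase-transition condition below is purely illustrative
        if trainer.fit_ckpt_path is not None and self._new_phase_beginning(trainer):
            checkpoint = torch.load(trainer.fit_ckpt_path, map_location="cpu")
            pl_module.load_state_dict(checkpoint["state_dict"])

    def _new_phase_beginning(self, trainer) -> bool:
        # placeholder: a real scheduler would track phase transitions here
        return False
```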

Does your PR introduce any breaking changes? If yes, please list them.

None

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following checklist:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Yes :)
Make sure you had fun coding 🙃

@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch 2 times, most recently from a2803c0 to 4c4c747 on October 28, 2021 00:45
@speediedan changed the title from "Feature/10197 finetuning scheduler callback [WIP]" to "Feature/10197 finetuning scheduler callback" on Oct 28, 2021
@speediedan marked this pull request as ready for review on October 28, 2021 01:36
@speediedan changed the title from "Feature/10197 finetuning scheduler callback" to "Feature/10197 finetuning scheduler callback [WIP]" on Oct 28, 2021
@speediedan changed the title from "Feature/10197 finetuning scheduler callback [WIP]" to "Feature/10197 finetuning scheduler callback" on Oct 28, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from d7da388 to ef67620 on October 28, 2021 19:36
speediedan added a commit to speediedan/lightning_sphinx_theme that referenced this pull request Oct 29, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from ef67620 to 349f73f on October 29, 2021 19:33
speediedan added a commit to speediedan/lightning_sphinx_theme that referenced this pull request Oct 31, 2021
speediedan added a commit to speediedan/lightning_sphinx_theme that referenced this pull request Oct 31, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from 349f73f to f302153 on October 31, 2021 03:00
speediedan added a commit to speediedan/lightning_sphinx_theme that referenced this pull request Oct 31, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from f302153 to 1c77531 on October 31, 2021 03:22
@mergify bot added the has conflicts label on Nov 1, 2021
@tchaton added this to the v1.6 milestone on Nov 1, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from 1c77531 to 65fb438 on November 1, 2021 20:03
@mergify bot removed the has conflicts label on Nov 1, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from 65fb438 to e26b11f on November 3, 2021 16:39
  @staticmethod
- def __apply_mapping_to_param_groups(param_groups: List[Dict[str, Any]], mapping: dict) -> List[Dict[str, Any]]:
+ def _apply_mapping_to_param_groups(param_groups: List[Dict[str, Any]], mapping: dict) -> List[Dict[str, Any]]:
@speediedan (author) commented:
Requesting this method be changed to protected instead of private since it has proved useful for FinetuningScheduler and I can conceive of it being useful in other contexts.
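For context, a double-underscore method name is name-mangled by Python at compile time, so subclasses and external callbacks cannot call it through normal attribute lookup. A minimal illustration with hypothetical class and method names:

```python
class Base:
    @staticmethod
    def __helper(x):
        # name-mangled at compile time to `_Base__helper`
        return x


class Sub(Base):
    def use(self, x):
        # this lookup mangles to `self._Sub__helper`, which does not exist,
        # so it raises AttributeError at runtime
        return self.__helper(x)


Sub().use(1)  # AttributeError: 'Sub' object has no attribute '_Sub__helper'
```

With a single leading underscore instead, the method resolves normally in subclasses and extensions such as FinetuningScheduler.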

Comment on lines +761 to +766
self.fit_ckpt_path = self.__set_ckpt_path(
    ckpt_path, model_provided=model, model_connected=self.lightning_module is not None
)
@speediedan (author) commented:

This attribute is needed for FinetuningScheduler and I think it could make sense for other potential extensions as well.
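A short usage sketch, assuming only the `fit_ckpt_path` attribute added by this PR; the callback, model, and checkpoint path below are placeholders:

```python
from pytorch_lightning import Callback, Trainer


class CkptPathAwareCallback(Callback):
    """Hypothetical extension that inspects the resolved checkpoint path."""

    def on_fit_start(self, trainer, pl_module):
        # with this change, the ckpt_path resolved for fit() is retained on
        # the trainer, so callbacks can read it from within fit()
        print(f"fit() is restoring from: {trainer.fit_ckpt_path}")


trainer = Trainer(callbacks=[CkptPathAwareCallback()])
trainer.fit(model, ckpt_path="path/to/checkpoint.ckpt")  # `model`: a placeholder LightningModule
```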

class ConvBlockParam(nn.Module):
    # add trivial test parameter to convblock to validate parent (non-leaf) module parameter handling
    self.parent_param = nn.Parameter(torch.zeros((1), dtype=torch.float))
    self.bn = nn.BatchNorm2d(out_channels)
@speediedan (author) commented:

FinetuningScheduler uses both ConvBlockParam and ConvBlock for testing and I think it's sufficiently useful elsewhere to have the two classes defined outside the complex_model test.
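For reference, a sketch of what these two test modules might look like, consistent with the excerpt above; the exact layer composition and constructor signatures are assumptions, not the PR's literal definitions:

```python
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Conv/activation/batch-norm block with parameters only on leaf modules (sketch)."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3)
        self.act = nn.ReLU()
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x):
        return self.bn(self.act(self.conv(x)))


class ConvBlockParam(nn.Module):
    """Like ConvBlock, but with a parameter registered directly on the parent module (sketch)."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3)
        self.act = nn.ReLU()
        # trivial parameter on the parent (non-leaf) module, per the excerpt above
        self.parent_param = nn.Parameter(torch.zeros(1, dtype=torch.float))
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x):
        return self.bn(self.act(self.conv(x)))
```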

@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from f3a478e to ee9c4c8 on December 7, 2021 20:25
@mergify bot added the has conflicts label on Dec 9, 2021
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from ee9c4c8 to 5064ddb on December 12, 2021 23:38
@mergify bot removed the has conflicts label on Dec 12, 2021
Commit message excerpts (truncated):
  • …ver installing packages unrequired outside of submodule usage)
  • …ria, user-specified epoch transitions or a composition of the two (the default mode). Also enables parameters to be fully specified or selected via regex.
  • …tribute and EarlyStopping modification. FTSEarlyStopping extension of EarlyStopping added. FTS dependencies managed by a callbackresolvermixin
  • …n to move it to the hub. leaving the PR open for a few minor changes required to support user registered callbacks like the finetuning_scheduler
@speediedan force-pushed the feature/10197_finetuning_scheduler_callback branch from 5064ddb to 30a985a on January 28, 2022 00:27
@mergify bot removed the has conflicts label on Jan 28, 2022
@speediedan changed the title from "Feature/10197 finetuning scheduler callback" to "Provide intra-fit() ckpt_path access to enable additional finetuning modalities" on Feb 1, 2022
@speediedan closed this on Feb 1, 2022
@speediedan (author) commented:
Since the scope of this PR has changed so much, I've opened a replacement PR to maximize clarity.

@speediedan deleted the feature/10197_finetuning_scheduler_callback branch on May 12, 2022 19:58