Finetuning scheduler #115

Merged
59 commits:

- b419e4d (speediedan): several recommended fixes for lightning_examples/finetuning-scheduler…
- 441104a (speediedan): notebook-based fts tutorial and example
- 2dfc99f (speediedan): tweak md format to H2 and update meta info
- b692250 (speediedan): update tutorial reqs
- 9595b14 (speediedan): stick to markdown w/o html wherever possible, remove temporary fts re…
- 33366fe (speediedan): Update lightning_examples/finetuning-scheduler/finetuning-scheduler.py
- 3d0fcfe (speediedan): additional tuning literature references and several additional recomm…
- d13d9d9 (speediedan): misc tutorial enhancements
- eb7e095 (speediedan): numerous expository enhancements and switch to DeBERTav3
- 542980b (speediedan): minor spelling and stylistic updates
- 65b8eed (Borda): Merge branch 'main' into finetuning_scheduler
- 9d207f4 (tchaton): Merge branch 'main' into finetuning_scheduler
- c99e169 (speediedan): several recommended fixes for lightning_examples/finetuning-scheduler…
- b89c5d0 (speediedan): notebook-based fts tutorial and example
- 3a42523 (speediedan): tweak md format to H2 and update meta info
- 7ddbcd4 (speediedan): update tutorial reqs
- 0020f25 (speediedan): stick to markdown w/o html wherever possible, remove temporary fts re…
- b250845 (speediedan): Update lightning_examples/finetuning-scheduler/finetuning-scheduler.py
- 8077909 (speediedan): additional tuning literature references and several additional recomm…
- d1429fa (speediedan): misc tutorial enhancements
- 3fc3887 (speediedan): numerous expository enhancements and switch to DeBERTav3
- 8d32874 (speediedan): minor spelling and stylistic updates
- 9722e22 (speediedan): Apply suggestions from code review
- 7dbae67 (pre-commit-ci[bot]): [pre-commit.ci] auto fixes from pre-commit.com hooks
- f17be58 (speediedan): Merge remote-tracking branch 'origin/finetuning_scheduler' into finet…
- 2367eba (speediedan): removed instantiate_registered_class to simplify example, improved do…
- 37e44c2 (speediedan): enhance linkcheck configuration and switch two images to be base64 en…
- 46822a1 (speediedan): Merge branch 'main' into finetuning_scheduler
- a209683 (speediedan): enabled setting base learning rates on a per-phase basis, updating cu…
- 2737b34 (speediedan): Merge branch 'main' into finetuning_scheduler
- 7b25d48 (speediedan): clarify finetuning schedule definition configuration options and upda…
- 822ce2f (speediedan): update notebook author metadata
- 1b4a8fe (speediedan): add links to finetuning-scheduler rtd documentation site
- c4a3dd9 (speediedan): update finetuning_scheduler dependency sha
- 5190a96 (speediedan): Merge branch 'main' into finetuning_scheduler
- 98c55df (speediedan): updated tutorial with initial version of finetuning-scheduler dependency
- 611e8e7 (speediedan): Merge remote-tracking branch 'upstream/main' into finetuning_scheduler
- cdc332a (speediedan): update finetuning-scheduler commit sha, simplify notebook dependency …
- bc1829b (speediedan): Merge branch 'main' into finetuning_scheduler
- 64ef270 (speediedan): sync finetuning_scheduler tutorial for pl 1.6.0rc1
- 7f6c7ff (speediedan): update commit sha to point to fts with official pl 1.6.0 release
- 59a6e36 (speediedan): Merge branch 'main' into finetuning_scheduler
- 835d15e (speediedan): require only finetuning-scheduler dependency now that it is available…
- 827ba47 (speediedan): Merge branch 'main' into finetuning_scheduler
- f6ec983 (speediedan): update fts documentation to point to stable version (0.1.1)
- e7c3838 (speediedan): implement workaround for cuda toolkit build difference of torch and t…
- 1d6ef3f (speediedan): Merge branch 'main' into finetuning_scheduler
- 4e0715e (speediedan): isort skip directive required to implement sentencepiece workaround
- 31d3cb2 (speediedan): update broken gan/datamodules tutorial links (#164)
- b339d3a (speediedan): Merge branch 'main' into finetuning_scheduler
- fc679d2 (Borda): Merge branch 'main' into finetuning_scheduler
- d4e3678 (speediedan): Apply suggestions from code review
- be5810f (pre-commit-ci[bot]): [pre-commit.ci] auto fixes from pre-commit.com hooks
- d150588 (speediedan): additional clarification of documentation and minor cleanup pre-publish
- 9fd27ce (speediedan): Merge branch 'main' into finetuning_scheduler
- f8ec6df (speediedan): metadata and aesthetic image improvements
- b68dae6 (speediedan): Merge branch 'main' into finetuning_scheduler
- 3cecc75 (speediedan): make notebook accelerator agnostic
- e73f443 (speediedan): reset lr_scheduler interval to epoch
Files changed:
New file (19 additions), the tutorial's metadata YAML:

title: Finetuning Scheduler
author: "[Dan Dale](https://github.com/speediedan)"
created: 2021-11-29
updated: 2022-05-10
license: CC BY-SA
build: 3
tags:
  - finetuning
description: |
  This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
  and demonstrates the use of it to finetune a small foundational model on the
  [RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of
  [SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified
  schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data
  and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
requirements:
  - finetuning-scheduler[examples]
accelerator:
  - GPU
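For orientation before the remaining files: the ``[examples]`` extra referenced above (``pip install finetuning-scheduler[examples]``) pulls in the Hugging Face dependencies, and the extension is driven by a single Lightning callback. Below is a minimal sketch of that wiring, not the notebook's actual code; ``MyLightningModule`` and ``MyDataModule`` are hypothetical stand-ins for the module and datamodule the tutorial defines, and the schedule filename refers to the YAML added next in this PR.

```python
# Minimal sketch (assumptions flagged): driving a Lightning Trainer with the
# FinetuningScheduler callback. MyLightningModule / MyDataModule are
# hypothetical stand-ins for the RTE module and datamodule the notebook defines.
import pytorch_lightning as pl
from finetuning_scheduler import FinetuningScheduler

model = MyLightningModule()   # hypothetical: the tutorial's RTE task module
datamodule = MyDataModule()   # hypothetical: wraps the SuperGLUE RTE data

trainer = pl.Trainer(
    accelerator="gpu",
    devices=1,
    callbacks=[
        # Point the callback at the explicit schedule added in this PR; if
        # ft_schedule were omitted, FinetuningScheduler would generate and use
        # a default layer-by-layer schedule instead.
        FinetuningScheduler(ft_schedule="RteBoolqModule_ft_schedule_deberta_base.yaml"),
    ],
)
trainer.fit(model, datamodule=datamodule)
```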
New file (18 additions): lightning_examples/finetuning-scheduler/RteBoolqModule_ft_schedule_deberta_base.yaml
0:
  params:
  - model.classifier.bias
  - model.classifier.weight
  - model.pooler.dense.bias
  - model.pooler.dense.weight
  - model.deberta.encoder.LayerNorm.bias
  - model.deberta.encoder.LayerNorm.weight
  - model.deberta.encoder.rel_embeddings.weight
  - model.deberta.encoder.layer.{0,11}.(output|attention|intermediate).*
1:
  params:
  - model.deberta.embeddings.LayerNorm.bias
  - model.deberta.embeddings.LayerNorm.weight
2:
  params:
  - model.deberta.embeddings.word_embeddings.weight
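How this schedule reads: each top-level key is a finetuning phase, executed in ascending order, with the scheduler thawing each phase's ``params`` and training until the early-stopping criteria trigger the next transition. Phase 0 starts with the classifier head, pooler, and top-of-encoder parameters; the embedding parameters thaw later, with the large ``word_embeddings`` matrix deferred to the final phase. ``params`` entries may be regular expressions, so the single ``layer.{0,11}`` line is read as covering all twelve DeBERTa-base encoder layers. The standalone sketch below illustrates that matching idea with plain ``re``; finetuning-scheduler resolves the patterns internally, the parameter names here are merely illustrative, and the explicit 0-11 alternation is our rewriting of the schedule's intent, not its literal text.

```python
# Standalone illustration (not finetuning-scheduler internals): checking which
# named parameters a phase-0 pattern covers. We rewrite the schedule's
# "layer.{0,11}" spec as an explicit regex alternation over layer indices 0-11.
import re

phase0_layers = re.compile(
    r"model\.deberta\.encoder\.layer\.(?:[0-9]|1[01])"
    r"\.(?:output|attention|intermediate)\..*"
)

# Illustrative parameter names, not an exhaustive list from the real model.
candidates = [
    "model.deberta.encoder.layer.0.attention.self.query_proj.weight",
    "model.deberta.encoder.layer.11.output.dense.bias",
    "model.deberta.embeddings.word_embeddings.weight",  # phase 2, not phase 0
]
for name in candidates:
    print(f"{name} -> phase 0: {bool(phase0_layers.fullmatch(name))}")
```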
New file (711 additions): lightning_examples/finetuning-scheduler/finetuning-scheduler.py (large diff not rendered by default)
Binary file added (+22.4 KB): lightning_examples/finetuning-scheduler/implicit_training_transition.png