
Finetuning scheduler #115


Merged: 59 commits merged on May 12, 2022

Commits (changes from all commits)
b419e4d
several recommended fixes for lightning_examples/finetuning-scheduler…
speediedan Nov 18, 2021
441104a
notebook-based fts tutorial and example
speediedan Nov 30, 2021
2dfc99f
tweak md format to H2 and update meta info
speediedan Nov 30, 2021
b692250
update tutorial reqs
speediedan Nov 30, 2021
9595b14
stick to markdown w/o html wherever possible, remove temporary fts re…
speediedan Dec 1, 2021
33366fe
Update lightning_examples/finetuning-scheduler/finetuning-scheduler.py
speediedan Dec 2, 2021
3d0fcfe
additional tuning literature references and several additional recomm…
speediedan Dec 2, 2021
d13d9d9
misc tutorial enhancements
speediedan Dec 4, 2021
eb7e095
numerous expository enhancements and switch to DeBERTav3
speediedan Dec 8, 2021
542980b
minor spelling and stylistic updates
speediedan Dec 11, 2021
65b8eed
Merge branch 'main' into finetuning_scheduler
Borda Dec 13, 2021
9d207f4
Merge branch 'main' into finetuning_scheduler
tchaton Dec 17, 2021
c99e169
several recommended fixes for lightning_examples/finetuning-scheduler…
speediedan Nov 18, 2021
b89c5d0
notebook-based fts tutorial and example
speediedan Nov 30, 2021
3a42523
tweak md format to H2 and update meta info
speediedan Nov 30, 2021
7ddbcd4
update tutorial reqs
speediedan Nov 30, 2021
0020f25
stick to markdown w/o html wherever possible, remove temporary fts re…
speediedan Dec 1, 2021
b250845
Update lightning_examples/finetuning-scheduler/finetuning-scheduler.py
speediedan Dec 2, 2021
8077909
additional tuning literature references and several additional recomm…
speediedan Dec 2, 2021
d1429fa
misc tutorial enhancements
speediedan Dec 4, 2021
3fc3887
numerous expository enhancements and switch to DeBERTav3
speediedan Dec 8, 2021
8d32874
minor spelling and stylistic updates
speediedan Dec 11, 2021
9722e22
Apply suggestions from code review
speediedan Dec 17, 2021
7dbae67
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 17, 2021
f17be58
Merge remote-tracking branch 'origin/finetuning_scheduler' into finet…
speediedan Dec 17, 2021
2367eba
removed instantiate_registered_class to simplify example, improved do…
speediedan Dec 17, 2021
37e44c2
enhance linkcheck configuration and switch two images to be base64 en…
speediedan Jan 7, 2022
46822a1
Merge branch 'main' into finetuning_scheduler
speediedan Jan 7, 2022
a209683
enabled setting base learning rates on a per-phase basis, updating cu…
speediedan Jan 13, 2022
2737b34
Merge branch 'main' into finetuning_scheduler
speediedan Jan 13, 2022
7b25d48
clarify finetuning schedule definition configuration options and upda…
speediedan Jan 13, 2022
822ce2f
update notebook author metadata
speediedan Jan 22, 2022
1b4a8fe
add links to finetuning-scheduler rtd documentation site
speediedan Jan 23, 2022
c4a3dd9
update finetuning_scheduler dependency sha
speediedan Jan 27, 2022
5190a96
Merge branch 'main' into finetuning_scheduler
speediedan Feb 9, 2022
98c55df
updated tutorial with initial version of finetuning-scheduler dependency
speediedan Feb 10, 2022
611e8e7
Merge remote-tracking branch 'upstream/main' into finetuning_scheduler
speediedan Feb 18, 2022
cdc332a
update finetuning-scheduler commit sha, simplify notebook dependency …
speediedan Feb 18, 2022
bc1829b
Merge branch 'main' into finetuning_scheduler
speediedan Mar 29, 2022
64ef270
sync finetuning_scheduler tutorial for pl 1.6.0rc1
speediedan Mar 29, 2022
7f6c7ff
update commit sha to point to fts with official pl 1.6.0 release
speediedan Mar 30, 2022
59a6e36
Merge branch 'main' into finetuning_scheduler
speediedan Apr 8, 2022
835d15e
require only finetuning-scheduler dependency now that it is available…
speediedan Apr 8, 2022
827ba47
Merge branch 'main' into finetuning_scheduler
speediedan Apr 18, 2022
f6ec983
update fts documentation to point to stable version (0.1.1)
speediedan Apr 18, 2022
e7c3838
implement workaround for cuda toolkit build difference of torch and t…
speediedan Apr 19, 2022
1d6ef3f
Merge branch 'main' into finetuning_scheduler
speediedan Apr 19, 2022
4e0715e
isort skip directive required to implement sentencepiece workaround
speediedan Apr 19, 2022
31d3cb2
update broken gan/datamodules tutorial links (#164)
speediedan Apr 21, 2022
b339d3a
Merge branch 'main' into finetuning_scheduler
speediedan Apr 21, 2022
fc679d2
Merge branch 'main' into finetuning_scheduler
Borda Apr 22, 2022
d4e3678
Apply suggestions from code review
speediedan May 10, 2022
be5810f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 10, 2022
d150588
additional clarification of documentation and minor cleanup pre-publish
speediedan May 10, 2022
9fd27ce
Merge branch 'main' into finetuning_scheduler
speediedan May 10, 2022
f8ec6df
metadata and aesthetic image improvements
speediedan May 10, 2022
b68dae6
Merge branch 'main' into finetuning_scheduler
speediedan May 11, 2022
3cecc75
make notebook accelerator agnostic
speediedan May 11, 2022
e73f443
reset lr_scheduler interval to epoch
speediedan May 11, 2022
5 changes: 5 additions & 0 deletions docs/source/conf.py
@@ -181,6 +181,11 @@
# (source start file, name, description, authors, manual section).
man_pages = [(master_doc, project, project + " Documentation", [author], 1)]

# -- Options for linkcheck builder ----------------------------------------------
# regex pattern 0: allow linking to a specific selection state in
# tensorboard.dev links while continuing to validate the base experiment link
linkcheck_anchors_ignore = ["scalars.*&runSelectionState.*"]

# -- Options for Texinfo output ----------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
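The `linkcheck_anchors_ignore` entry added above is an ordinary regular expression: anchors matching it are skipped by Sphinx's linkcheck builder while the base experiment URL is still validated. A quick way to sanity-check the pattern outside Sphinx is to match it against representative anchor fragments (the sample anchors below are hypothetical stand-ins for what tensorboard.dev links carry):

```python
import re

# The pattern added to conf.py: anchors matching it are skipped by the
# linkcheck builder while the base experiment link is still validated.
IGNORED_ANCHORS = ["scalars.*&runSelectionState.*"]

def anchor_ignored(anchor: str) -> bool:
    """Mimic linkcheck's behaviour: skip anchors matching any ignore regex."""
    return any(re.match(p, anchor) for p in IGNORED_ANCHORS)

# Hypothetical anchor fragments of the kind tensorboard.dev links carry.
print(anchor_ignored("scalars&runSelectionState=eyJ0cnVlIjp0cnVlfQ"))  # True
print(anchor_ignored("scalars"))  # False (no run-selection state)
```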
19 changes: 19 additions & 0 deletions lightning_examples/finetuning-scheduler/.meta.yml
@@ -0,0 +1,19 @@
title: Finetuning Scheduler
author: "[Dan Dale](https://github.com/speediedan)"
created: 2021-11-29
updated: 2022-05-10
license: CC BY-SA
build: 3
tags:
- finetuning
description: |
This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
and demonstrates its use to finetune a small foundational model on the
[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of
[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified
schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data
and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
requirements:
- finetuning-scheduler[examples]
accelerator:
- GPU
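The notebook description above mentions "iterative early-stopping defined according to a user-specified schedule". The general idea can be illustrated with a toy driver loop — a hedged sketch only, not the finetuning-scheduler implementation; the phase structure, the `patience` value, and the fabricated loss curves are all invented for illustration:

```python
# Toy illustration of schedule-driven, phase-wise early stopping.
# NOT the finetuning-scheduler implementation; all names/values are invented.

def run_phase(losses, patience=2):
    """Consume per-epoch val losses until no improvement for `patience` epochs.
    Returns the number of epochs trained in this phase."""
    best = float("inf")
    stale = 0
    for epoch, loss in enumerate(losses, start=1):
        if loss < best:
            best, stale = loss, 0
        else:
            stale += 1
        if stale >= patience:
            return epoch
    return len(losses)

# An invented three-phase schedule, each phase unfreezing more parameter
# groups, plus a fabricated validation-loss curve per phase.
schedule = {
    0: ["classifier.*"],           # head only
    1: ["encoder.layer.1[01].*"],  # + last two encoder layers
    2: ["embeddings.*"],           # + embeddings
}
phase_losses = {
    0: [0.9, 0.7, 0.6, 0.61, 0.62],  # plateaus, triggering early stop
    1: [0.55, 0.5, 0.51, 0.52],
    2: [0.48, 0.49, 0.5],
}

thawed = []
for phase, patterns in schedule.items():
    thawed.extend(patterns)  # cumulatively unfreeze this phase's groups
    epochs = run_phase(phase_losses[phase])
    print(f"phase {phase}: trained {epochs} epochs, thawed {thawed}")
```

Each phase trains until its early-stopping criterion fires, then the next phase's parameter groups are thawed and training resumes — the schedule file in this PR specifies which groups thaw at each depth.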
@@ -0,0 +1,18 @@

0:
params:
- model.classifier.bias
- model.classifier.weight
- model.pooler.dense.bias
- model.pooler.dense.weight
- model.deberta.encoder.LayerNorm.bias
- model.deberta.encoder.LayerNorm.weight
- model.deberta.encoder.rel_embeddings.weight
- model.deberta.encoder.layer.{0,11}.(output|attention|intermediate).*
1:
params:
- model.deberta.embeddings.LayerNorm.bias
- model.deberta.embeddings.LayerNorm.weight
2:
params:
- model.deberta.embeddings.word_embeddings.weight
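Each phase in the schedule above lists parameter-name patterns, and phase 0 mixes literal names with a regex (`layer.{0,11}.(output|attention|intermediate).*`) covering a range of encoder layers. A minimal sketch of how such patterns can be resolved against a model's parameter names with plain `re` — the parameter names below are invented stand-ins, and this is illustrative matching logic, not the library's own resolver:

```python
import re

# A literal name and the regex pattern from phase 0 of the schedule.
phase0_patterns = [
    r"model.classifier.bias",
    r"model.deberta.encoder.layer.{0,11}.(output|attention|intermediate).*",
]

# Invented stand-ins for a model's named parameters.
param_names = [
    "model.classifier.bias",
    "model.deberta.encoder.layer.0.output.dense.weight",
    "model.deberta.encoder.layer.11.attention.self.query_proj.weight",
    "model.deberta.embeddings.word_embeddings.weight",
]

def thawed_in_phase(patterns, names):
    """Return the parameter names fully matched by any phase pattern."""
    return [n for n in names if any(re.fullmatch(p, n) for p in patterns)]

for name in thawed_in_phase(phase0_patterns, param_names):
    print(name)  # classifier bias plus the two encoder-layer parameters
```

Note that in the regex, `.{0,11}` matches up to eleven arbitrary characters after `layer`, which is loose enough to cover layer indices `0` through `11`; the embeddings parameters fall through to the later phases.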
711 changes: 711 additions & 0 deletions lightning_examples/finetuning-scheduler/finetuning-scheduler.py

Large diffs are not rendered by default.
