Commit 2c887cf

Do not prepare lr scheduler so it has the right number of steps (#24088)
* Do not prepare lr scheduler so it has the right number of steps
* Trigger CI
* Trigger CI
* Trigger CI
* Add fake comment
* Remove fake comment
* Trigger CI please!
1 parent 12298cb commit 2c887cf

1 file changed, +2 -3 lines changed

Diff for: src/transformers/trainer.py

@@ -1747,9 +1747,7 @@ def _inner_training_loop(

         # prepare using `accelerator` prepare
         if use_accelerator_prepare:
-            model, self.optimizer, self.lr_scheduler = self.accelerator.prepare(
-                self.model, self.optimizer, self.lr_scheduler
-            )
+            model, self.optimizer = self.accelerator.prepare(self.model, self.optimizer)

         if self.is_fsdp_enabled:
             self.model = model
@@ -1996,6 +1994,7 @@ def _inner_training_loop(
                         optimizer_was_run = scale_before <= scale_after
                     else:
                         self.optimizer.step()
+                        optimizer_was_run = not self.accelerator.optimizer_step_was_skipped

                     if optimizer_was_run:
                         # Delay optimizer scheduling until metrics are generated
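Context for the change: when the LR scheduler is passed through `accelerator.prepare`, Accelerate wraps it and can advance it more than once per optimizer step in multi-process runs, rescaling a schedule the Trainer already built with the correct total number of optimization steps. Leaving the scheduler unprepared preserves that count, and the new `optimizer_step_was_skipped` check still avoids advancing the schedule when mixed-precision training skips an optimizer step. Below is a minimal sketch of the same pattern outside the Trainer; the toy model, optimizer, and step counts are illustrative and not taken from the commit.

# Minimal sketch (assumes `torch`, `accelerate`, and `transformers` are installed)
# of the pattern this commit adopts: prepare only the model and optimizer, leave
# the LR scheduler as built, and gate scheduler steps on whether the optimizer
# step actually ran. Model, data, and step counts here are placeholders.
import torch
from accelerate import Accelerator
from transformers import get_linear_schedule_with_warmup

accelerator = Accelerator()

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
# Built with the intended total step count; left unprepared so Accelerate
# does not rescale the schedule per process.
lr_scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=100
)

# Mirrors the first hunk: the scheduler is excluded from `prepare`.
model, optimizer = accelerator.prepare(model, optimizer)

for step in range(100):
    inputs = torch.randn(8, 10, device=accelerator.device)
    loss = model(inputs).sum()
    accelerator.backward(loss)
    optimizer.step()
    # Mirrors the second hunk: under mixed precision the optimizer step can be
    # skipped (e.g. on gradient overflow), and the schedule must not advance then.
    if not accelerator.optimizer_step_was_skipped:
        lr_scheduler.step()
    optimizer.zero_grad()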
