Num_of_step increases infinitely but num_of_epoch stays stuck at 0 with a dynamic batch size dataset #18174
Unanswered
Lingeng56 asked this question in code help: NLP / ASR / TTS
Replies: 0 comments
Hey, I implemented a dynamic batch size dataset that performs the data batching procedure inside the dataset class rather than in a sampler.
My dataset: the functions <shuffle, sort, batch> inside __batched_data all return generators.
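A simplified sketch of the idea (the real implementation is more involved; the class name `DynamicBatchDataset`, the `max_tokens` / buffer-size parameters, and the sample format are placeholders made up for illustration, but the shuffle → sort → batch generator pipeline in `__batched_data` matches what I do):

```python
import random
from torch.utils.data import IterableDataset


class DynamicBatchDataset(IterableDataset):
    """Batching happens inside the dataset, not in a sampler."""

    def __init__(self, samples, max_tokens=4000, shuffle_buffer=1000, sort_buffer=500):
        super().__init__()
        self.samples = samples            # e.g. list of (utt_id, transcript) pairs
        self.max_tokens = max_tokens      # token budget per dynamic batch
        self.shuffle_buffer = shuffle_buffer
        self.sort_buffer = sort_buffer

    def _shuffle(self, data):
        # generator: yields samples from a shuffled local buffer
        buf = []
        for sample in data:
            buf.append(sample)
            if len(buf) >= self.shuffle_buffer:
                random.shuffle(buf)
                yield from buf
                buf = []
        random.shuffle(buf)
        yield from buf

    def _sort(self, data):
        # generator: sorts a local buffer by length so batches are length-homogeneous
        buf = []
        for sample in data:
            buf.append(sample)
            if len(buf) >= self.sort_buffer:
                buf.sort(key=lambda s: len(s[1]))
                yield from buf
                buf = []
        buf.sort(key=lambda s: len(s[1]))
        yield from buf

    def _batch(self, data):
        # generator: groups samples until the token budget is hit -> dynamic batch size
        batch, tokens = [], 0
        for sample in data:
            batch.append(sample)
            tokens += len(sample[1])
            if tokens >= self.max_tokens:
                yield batch
                batch, tokens = [], 0
        if batch:
            yield batch

    def __batched_data(self):
        return self._batch(self._sort(self._shuffle(iter(self.samples))))

    def __iter__(self):
        # every item yielded by the dataset is already a complete batch
        return self.__batched_data()
```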
My dataloader
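Something along these lines (again simplified; `train_samples` is a placeholder, and the important part is `batch_size=None`, since the dataset already yields complete batches):

```python
from torch.utils.data import DataLoader

# batch_size=None disables the DataLoader's automatic batching, because each
# item the dataset yields is already a complete (dynamically sized) batch.
train_loader = DataLoader(
    DynamicBatchDataset(train_samples, max_tokens=4000),  # train_samples: placeholder
    batch_size=None,
    num_workers=0,                   # >0 workers would need per-worker sharding
    collate_fn=lambda batch: batch,  # replace with real padding / tensorization
)
```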
During training, the progress bar can't show the total number of steps in an epoch, which is what I expected, since with dynamic batching the number of batches isn't known in advance. However, the step count keeps increasing indefinitely while the epoch stays stuck at 0.

Has anyone faced the same problem and could give me some suggestions?