Ambiguity in transformer_tutorial.py #1037


Closed
bjourne opened this issue Jun 22, 2020 · 1 comment · Fixed by #2383
Labels: docathon-h1-2023 (A label for the docathon in H1 2023), easy, Text (Issues relating to text tutorials)

Comments

bjourne commented Jun 22, 2020

The text in the tutorial indicates that the batch size is the outermost dimension of the data: "For instance, with the alphabet as the sequence (total length of 26) and a batch size of 4, we would divide the alphabet into 4 sequences of length 6:" But the code then calls .t(), so the sequence length, not the batch size, ends up as the outermost dimension.
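For anyone hitting the same confusion, here is a minimal sketch of the batchify logic in question (reconstructed from the tutorial's description, not copied verbatim from it):

```python
import torch

def batchify(data: torch.Tensor, bsz: int) -> torch.Tensor:
    # Trim off any leftover elements that don't divide evenly into bsz parts.
    nbatch = data.size(0) // bsz
    data = data.narrow(0, 0, nbatch * bsz)
    # view(bsz, -1) puts the batch on the outermost dimension: (bsz, nbatch).
    # The .t() then transposes it, so the returned tensor has shape
    # (seq_len, batch_size) -- sequence length, not batch size, is outermost.
    return data.view(bsz, -1).t().contiguous()

# The alphabet example from the quoted text: 26 tokens, batch size 4.
alphabet = torch.arange(26)   # stand-in for the token ids of A..Z
batches = batchify(alphabet, 4)
print(batches.shape)   # torch.Size([6, 4]): 6 time steps x 4 sequences
print(batches[:, 0])   # one sequence per *column*: tensor([0, 1, 2, 3, 4, 5])
```

So the text describes the (bsz, nbatch) view, while the returned tensor is the transposed (seq_len, batch_size) layout, which is what the ambiguity is about.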

holly1238 added the Text (Issues relating to text tutorials) label on Jul 27, 2021
svekars added the easy and docathon-h1-2023 (A label for the docathon in H1 2023) labels on May 31, 2023
QasimKhan5x (Contributor) commented Jun 1, 2023

/assigntome
