The .train(False) or .eval() does not turn off the gradient computing #1756

Closed
sonack opened this issue Dec 6, 2021 · 7 comments · Fixed by #2659

Comments


sonack commented Dec 6, 2021

net.train(False) # Don't need to track gradents for validation

and
net.train(True) # Turn gradients back on for training

Maybe these comments are wrong? Calling .train(False) or .eval() switches the module to evaluation mode, but it does not turn off gradient computation.

cc @suraj813 @aaronenyeshi @chaekit @sekyondaMeta @svekars @carljparker @NicolasHug @kit1980 @subramen @robieta
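
For context, here is a minimal sketch of the behavior this issue describes (the nn.Linear model and input below are purely illustrative): .train(False) / .eval() only switches layers such as dropout and batch norm to evaluation behavior, while gradient tracking is controlled separately with torch.no_grad().

import torch
import torch.nn as nn

net = nn.Linear(4, 2)
x = torch.randn(1, 4)

net.eval()                  # evaluation mode; autograd is still active
y = net(x)
print(y.requires_grad)      # True -- gradients are still being tracked

with torch.no_grad():       # this is what actually disables gradient tracking
    y = net(x)
    print(y.requires_grad)  # False

So the quoted comments would only be accurate if the validation pass were wrapped in torch.no_grad() rather than relying on net.train(False).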

@neuralninja27 (Contributor)

/assigntome


Killpit commented Jun 6, 2023

/assigntome


github-actions bot commented Jun 6, 2023

The issue is already assigned. Please pick an open and unassigned issue with the docathon-h1-2023 label.

svekars (Contributor) commented Oct 24, 2023

This issue has been unassigned due to inactivity. If you are still planning to work on this, you can send a PR referencing this issue.

@svekars added the docathon-h2-2023 label and removed the docathon-h1-2023 (a label for the docathon in H1 2023) label on Oct 30, 2023
zabboud (Contributor) commented Nov 1, 2023

/assigntome

svekars (Contributor) commented Nov 7, 2023

This issue has been unassigned due to inactivity. If you are working on this issue, assign it to yourself and send a PR ASAP.

@Viditagarwal7479 (Contributor)

/assigntome
