[BUG] Update usage of deprecated amp APIs #3331

Closed
svekars opened this issue Apr 16, 2025 · 2 comments
Labels: amp (Issues relating to the automatic mixed precision tutorial), bug

Comments
svekars (Contributor) commented Apr 16, 2025

Add Link

https://pytorch.org/docs/stable/amp.html

Describe the bug

See the link above for the list of deprecated APIs.

We need to fix:

FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.

And the same for:

torch.cuda.amp.autocast(args...) and torch.cpu.amp.autocast(args...) will be deprecated. Please use torch.autocast("cuda", args...) or torch.autocast("cpu", args...) instead.

torch.cuda.amp.GradScaler(args...) and torch.cpu.amp.GradScaler(args...) will be deprecated. Please use torch.GradScaler("cuda", args...) or torch.GradScaler("cpu", args...) instead.

Describe your environment

  • PyTorch 2.6 or later
@svekars svekars added amp Issues relating to the automatic mixed precision tutorial bug labels Apr 16, 2025
@svekars svekars changed the title [BUG] torch.cuda.amp.autocast with torch.amp.autocast [BUG] Update usage of deprecated amp APIs Apr 16, 2025
nirajkamal (Contributor) commented:
Hi @svekars, I am working on this issue, can you please assign it to me?

nirajkamal (Contributor) commented May 29, 2025

Thank you @svekars, I think the branch with the changes for this issue has been merged.

Also, do you think it's more appropriate to use torch.amp.autocast(...) instead of torch.autocast(...), and torch.amp.GradScaler(...) instead of torch.GradScaler(...)? The usage in the docs is not consistent.

nWEIdia pushed a commit to nWEIdia/pytorch that referenced this issue Jun 2, 2025
qingyi-yan pushed a commit to qingyi-yan/pytorch that referenced this issue Jun 3, 2025
iupaikov-amd pushed a commit to ROCm/pytorch that referenced this issue Jun 4, 2025