Typing for accelerators and plugins #7022


Merged: 31 commits merged into master on Apr 15, 2021

Conversation

@carmocca (Contributor) commented Apr 14, 2021

What does this PR do?

Part of #7035

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • [n/a] Did you make sure to update the documentation with your changes? (if necessary)
  • [n/a] Did you write any new necessary tests? (not for typos and docs)
  • [n/a] Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

@carmocca carmocca added this to the 1.3 milestone Apr 14, 2021
@carmocca carmocca self-assigned this Apr 14, 2021
@carmocca carmocca marked this pull request as draft April 14, 2021 20:52
@carmocca carmocca mentioned this pull request Apr 15, 2021
@tchaton tchaton mentioned this pull request Apr 15, 2021
@carmocca carmocca changed the title from "[Blocked by #7015] Typing for accelerators and plugins" to "Typing for accelerators and plugins" Apr 15, 2021
    grad_clip_val = float(clip_val)
    if grad_clip_val <= 0:
        return

    max_norm = grad_clip_val
    parameters = self.model.parameters()
@carmocca (Contributor, Author) commented on this diff:
Is this equivalent to what we had before?
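
For reference, here is a minimal sketch of how a snippet like the one above typically feeds into PyTorch's built-in norm clipping. The standalone-function shape, the function name, and the norm_type default are assumptions for illustration, not code from this PR:

    import torch
    from torch import nn

    def clip_gradients(model: nn.Module, clip_val: float, norm_type: float = 2.0) -> None:
        # Hypothetical helper: coerce the threshold to float and treat a
        # non-positive value as "clipping disabled", as in the diff above.
        grad_clip_val = float(clip_val)
        if grad_clip_val <= 0:
            return

        max_norm = grad_clip_val
        parameters = model.parameters()
        # clip_grad_norm_ rescales the gradients in place so that their
        # combined norm does not exceed max_norm.
        torch.nn.utils.clip_grad_norm_(parameters, max_norm, norm_type=norm_type)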

@codecov bot commented Apr 15, 2021
Codecov Report

Merging #7022 (e005c44) into master (f645df5) will decrease coverage by 5%.
The diff coverage is 85%.

@@           Coverage Diff           @@
##           master   #7022    +/-   ##
=======================================
- Coverage      92%     87%    -5%     
=======================================
  Files         194     194            
  Lines       12386   12428    +42     
=======================================
- Hits        11414   10774   -640     
- Misses        972    1654   +682     

@carmocca carmocca marked this pull request as ready for review April 15, 2021 12:55
@carmocca carmocca enabled auto-merge (squash) April 15, 2021 13:05
@carmocca carmocca merged commit f29ecbf into master Apr 15, 2021
@carmocca carmocca deleted the typing-accelerators-plugins branch April 15, 2021 16:48
@carmocca carmocca mentioned this pull request Apr 27, 2021
6 participants