Remove pytorch 1.10(dev) CI job #9673
Conversation
I'm confused why this PR is now suddenly running Docker CI jobs. [Edit] Answer: they only run on PRs that modify these paths (and on all pushes to master): https://github.com/PyTorchLightning/pytorch-lightning/blob/2b2537d9a064cd437978b111360b904438452145/.github/workflows/ci_dockers.yml#L10-L18
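For reference, a GitHub Actions path filter of the kind linked above looks roughly like this (the paths here are illustrative, not the exact list from `ci_dockers.yml`):

```yaml
on:
  push:
    branches: [master]
  pull_request:
    branches: [master]
    # The workflow runs on a PR only when at least one changed file
    # matches one of these patterns; pushes to master always trigger it.
    paths:
      - "dockers/**"
      - ".github/workflows/ci_dockers.yml"
      - "requirements.txt"
```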
In other words, we would stop caring about upcoming compatibility between PL and PT, meaning neither actual master nor the 1.4.x releases will be fully compatible with PT 1.10...
Hope that by "fix" you mean a real fix, not dropping it as this PR does =)
The problem is, I would argue that we already don't care about compatibility between PL and PT 1.10, given that this test has literally been failing since it was introduced. The only way to make people care is to make the job required on PRs. Since we don't want to do that (the nightly changes every day), the only way to keep our CI green is to delete this job. We will add back testing with 1.10 when there is a stable release candidate, at which point we can make it required. I know it's painful, but I don't see a better way if our goal is a green CI. Do you have any other ideas?
Can we rather just skip these few failing tests when running on nightly and add a TODO note, so we keep it under control?
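A minimal sketch of the skip-if-nightly idea, using stdlib `unittest` (pytest's `pytest.mark.skipif` works the same way). The `_TORCH_VERSION` constant stands in for `torch.__version__`, and the test body is a placeholder:

```python
import unittest

# In real code this would be torch.__version__; hard-coded here for illustration.
_TORCH_VERSION = "1.10.0.dev20210920"


def is_nightly(version: str) -> bool:
    # Nightly/dev wheels embed a ".devYYYYMMDD" segment in their version string,
    # e.g. "1.10.0.dev20210920", while stable releases look like "1.9.1".
    return ".dev" in version


# TODO(#9280): remove this skip once PyTorch 1.10 has a stable release candidate.
@unittest.skipIf(is_nightly(_TORCH_VERSION), "broken on PyTorch 1.10 nightly")
class QuantizationTest(unittest.TestCase):
    def test_placeholder(self):
        self.assertTrue(True)
```

This keeps the nightly job itself running (and potentially required), while the known-broken tests are tracked explicitly instead of silently failing.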
I would be fine with that, if we make the nightly job required. Otherwise, the same thing will just happen again if people can merge PRs with the nightly job failing.
Cc: @PyTorchLightning/core-contributors
@daniellepintz I share @Borda's concerns around removing support for testing PyTorch nightly with Lightning. There could be legitimate bugs we ought to surface by having this testing, and for anyone running on pytorch main + lightning main, this test coverage is critical. My understanding is that the only failing 1.10 tests are related to quantization. Given that the test failures are targeted and the 1.10 RC is imminent, I'd prefer addressing these test failures specifically first. @Borda, historically, have PyTorch + Lightning nightlies been passing?
Yes, for most of the time, and when they weren't, it was actively resolved on the PL side to prevent future incompatibility.
@daniellepintz |
@awaelchli sounds great, closing now! |
What does this PR do?
This job has been failing since it was first introduced in #8133. It is the only job on our CI that consistently fails, and thus the biggest blocker to having a green CI, which is highly desired. As agreed in the last FB/Grid weekly sync, we will disable this job until there is a release candidate for PyTorch 1.10.
Fixes #9280
Does your PR introduce any breaking changes? If yes, please list them.
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the Review guidelines.
Did you have fun?
Make sure you had fun coding 🙃