Replace all CHECK_ and DCHECK_ with TORCH_* macros #82032
As of commit c3db99b, Dr. CI reported 1 new CI failure recognized by patterns; it does not appear to be due to an upstream breakage.
@wconstab has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
OSS CI failures seem transient/unrelated:
- pull / linux-bionic-cuda11.6-py3.7-gcc7 / test (functorch, 1, 1, linux.4xlarge.nvidia.gpu)
- pull / linux-focal-rocm5.2-py3.7 / build

Might wait on being able to run internal CI before landing if I can, but otherwise this looks ready to land.
LGTM.
Thanks for providing this support @wconstab! My local build of PyTorch/XLA is unblocked by this change (i.e. running into new errors unrelated to these macros - lol).
Reference PyTorch/XLA PR: pytorch/xla#3745
CC @JackCaoG
```
cd {pytorch root}
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/DCHECK_EQ/TORCH_DCHECK_EQ/g' {} \;
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_EQ/ TORCH_CHECK_EQ/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_NE/ TORCH_CHECK_NE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_LE/ TORCH_CHECK_LE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_LT/ TORCH_CHECK_LT/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_GE/ TORCH_CHECK_GE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ CHECK_GT/ TORCH_CHECK_GT/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ DCHECK_NE/ TORCH_DCHECK_NE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ DCHECK_LE/ TORCH_DCHECK_LE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ DCHECK_LT/ TORCH_DCHECK_LT/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ DCHECK_GE/ TORCH_DCHECK_GE/g' {} \;
lintrunner -a
find aten c10 caffe2 modules test torch -type f -exec sed -i 's/ DCHECK_GT/ TORCH_DCHECK_GT/g' {} \;
lintrunner -a
```
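The effect of the rename at a call site can be sketched with minimal stand-in macros. These are illustrative only, not the actual definitions from the c10 logging headers; `TORCH_CHECK_OP` and `add_positive` are hypothetical names used just for this sketch:

```cpp
#include <cassert>
#include <cstdio>
#include <cstdlib>

// Illustrative stand-in for the renamed comparison-check macros.
// On failure, the stringified operands and operator are printed and
// the process aborts.
#define TORCH_CHECK_OP(a, b, op)                       \
  do {                                                 \
    if (!((a)op(b))) {                                 \
      std::fprintf(stderr, "Check failed: %s %s %s\n", \
                   #a, #op, #b);                       \
      std::abort();                                    \
    }                                                  \
  } while (0)

// Renamed variants, prefixed so they no longer clash with glog's
// CHECK_EQ / CHECK_LE / CHECK_GT defines.
#define TORCH_CHECK_EQ(a, b) TORCH_CHECK_OP(a, b, ==)
#define TORCH_CHECK_LE(a, b) TORCH_CHECK_OP(a, b, <=)
#define TORCH_CHECK_GT(a, b) TORCH_CHECK_OP(a, b, >)

// Hypothetical call site after the sed rewrite: CHECK_GT(x, 0)
// has become TORCH_CHECK_GT(x, 0).
int add_positive(int a, int b) {
  TORCH_CHECK_GT(a, 0);
  TORCH_CHECK_GT(b, 0);
  return a + b;
}
```

Here `add_positive(2, 3)` returns 5, while a call with a non-positive argument prints a message and aborts, mirroring the fatal behavior of the glog-style checks.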
Summary: Avoid exposing defines that conflict with google logging, since this blocks external usage of libtorch in certain cases.

All the 'interesting' changes should be in these two files, and the rest should just be mechanical changes via sed:
- c10/util/logging_is_not_google_glog.h
- c10/util/logging_is_google_glog.h

Fixes #81415
cc miladm malfet
Pull Request resolved: #82032
Approved by: https://github.com/soumith, https://github.com/miladm
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/4f34cd6d1e91dcd82ee30c3ea39bdb8a0fa93e8b
Original Phabricator Test Plan: Imported from GitHub, without a `Test Plan:` line.
Reviewed By: osalpekar
Differential Revision: D38180841
Pulled By: wconstab
fbshipit-source-id: b2f6c3fef609ec6ad616929a16d342e525ef0155
Summary: Pull Request resolved: #2582

The CHECK_* macros were deprecated upstream, so we should replace them here as well. Similar to pytorch/vision#6322; relates to pytorch/pytorch#82032.

Signed-off-by: Eli Uriegas <[email protected]>
Test Plan: Imported from OSS
Reviewed By: malfet, mthrok
Differential Revision: D38208356
Pulled By: seemethere
fbshipit-source-id: 6f42d517362f415e0775803514eee2628402918f
Hi @wconstab, I have the impression that the … cc @datumbox
+1. This caused issues on some of the TorchVision tests which used the idiom: … You can see additional info at pytorch/vision#6322 (review)
Summary: Resolves https://www.internalfb.com/tasks/?t=128004042

Caused by TorchVision's PR at pytorch/vision#6322, which was in response to a change on PyTorch Core at pytorch/pytorch#82032.

Reviewed By: fmassa
Differential Revision: D38383266
fbshipit-source-id: 3f0ebd04a610031c4720123d1869a851f76455cd
Hey @fmassa, sorry for the trouble. A bit of background first, which someone may correct me on as I am not familiar with the history or intention behind this.

PyTorch is sometimes built with google log (glog) and sometimes without it. logging_is_not_google_glog.h defines CHECK macros that are compiled away in release builds; my diff simply renames CHECK_ to TORCH_CHECK_ here. logging_is_google_glog.h defines CHECK_ so that it always runs, with or without release builds; my diff changes the semantics here, making them match.

So, the question I would have is: going forward, do we want different behavior for TORCH_CHECK depending on whether we use glog under the hood, or do we want consistent behavior? I'm missing the context of why the inconsistency was there in the first place - perhaps it was intentional, or perhaps it was an oversight.

cc @ezyang
After chatting with @ezyang I realized that this diff incorrectly set the CHECK_ macros to be optimized out in the 'is_glog' file, while in the 'is_not_glog' file only the DCHECK macros get optimized out. I have opened #83216 to fix this.

Additionally, we were discussing the merits of maintaining TORCH_CHECK_* and TORCH_DCHECK_* macros in the first place, which are implemented as aborts under the hood, in contrast with the …
Summary: Makes TORCH_CHECK_* run unconditionally, leaving only TORCH_DCHECK_* special-cased to be optimized out in release builds.

Fixes a bug in #82032, relating to this comment: #82032 (comment)

Pull Request resolved: #83216
Approved by: https://github.com/ezyang, https://github.com/datumbox
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/abb2204f6a9c5af4e14e11cc69de8bcb5cceaea0
Reviewed By: seemethere
Differential Revision: D38624296
Pulled By: wconstab
fbshipit-source-id: 23f8db4b3cad7ba9d2105a33b0d40736326f7c4a
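The fixed semantics can be sketched with stand-in macros: TORCH_CHECK_* fires in every build, while TORCH_DCHECK_* compiles away when NDEBUG is set. This is an illustrative model only (the `scale` helper is hypothetical), not the real c10 definitions:

```cpp
#include <cassert>
#include <cstdio>
#include <cstdlib>

// Always-on check: runs in debug and release builds alike.
#define TORCH_CHECK_GT(a, b)                                   \
  do {                                                         \
    if (!((a) > (b))) {                                        \
      std::fprintf(stderr, "Check failed: %s > %s\n", #a, #b); \
      std::abort();                                            \
    }                                                          \
  } while (0)

// Debug-only check: compiled away under NDEBUG. The dead `while (false)`
// body keeps the operands type-checked even when the check never runs.
#ifdef NDEBUG
#define TORCH_DCHECK_GT(a, b) \
  while (false) TORCH_CHECK_GT(a, b)
#else
#define TORCH_DCHECK_GT(a, b) TORCH_CHECK_GT(a, b)
#endif

// Hypothetical call site: the first check is enforced in all builds,
// the second only in debug builds.
int scale(int x) {
  TORCH_CHECK_GT(x, 0);
  TORCH_DCHECK_GT(100, x);
  return 2 * x;
}
```

With this split, release builds keep the user-facing validation but drop the debug-only sanity check, which is the behavior #83216 restores.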
Summary: Building master fails with the following:

```
pytorch/caffe2/contrib/nccl/cuda_nccl_gpu.cc:180:51: error: 'CHECK_NOTNULL' was not declared in this scope; did you mean 'TORCH_CHECK_NOTNULL'?
  180 | CUDA_ENFORCE(cudaStreamWaitEvent(CHECK_NOTNULL(ex.stream), event, 0));
```

Seems like #82032 just missed one find-replace. cc @wconstab - not sure why this wouldn't have been caught elsewhere.

Pull Request resolved: #84720
Approved by: https://github.com/wconstab
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/0fd8f6b93cb3d1342a10ef71d4b27356f0dfc9b1
Reviewed By: izaitsevfb
Differential Revision: D39407045
fbshipit-source-id: 5e14964d966ddf819459d01e3b357abd6632d72d
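The error above also shows why this macro differs from the comparison checks: CHECK_NOTNULL is used inside a larger expression, so it must return its argument. A minimal sketch of that shape (illustrative only; `CheckNotNullImpl` and `deref` are hypothetical names, not the real c10 implementation):

```cpp
#include <cassert>
#include <cstdio>
#include <cstdlib>

// A NOTNULL-style check must forward its pointer so it can be nested in a
// call such as CUDA_ENFORCE(cudaStreamWaitEvent(TORCH_CHECK_NOTNULL(p), ...)).
template <typename T>
T* CheckNotNullImpl(const char* expr, T* ptr) {
  if (ptr == nullptr) {
    std::fprintf(stderr, "Check failed: %s must be non-null\n", expr);
    std::abort();
  }
  return ptr;  // forward the (non-null) pointer to the enclosing expression
}

#define TORCH_CHECK_NOTNULL(p) CheckNotNullImpl(#p, (p))

// Hypothetical call site: the check participates directly in the expression.
int deref(int* p) { return *TORCH_CHECK_NOTNULL(p); }
```

Because the macro expands to a function call that returns the pointer, a plain statement-style rename of CHECK_NOTNULL would not have worked, which is why this occurrence needed its own fix.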