
Set devices to 1 when it's just Trainer(accelerator='auto') #10192


Closed
wants to merge 9 commits

Conversation

kaushikb11
Contributor

What does this PR do?

Set devices to 1 when it's just Trainer(accelerator='auto')
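For illustration, a minimal sketch of the proposed behavior (the single-device default is what this PR adds; the Trainer calls themselves are the existing public API):

```python
from pytorch_lightning import Trainer

# Proposed: auto-detect the accelerator type and fall back to a single device.
trainer = Trainer(accelerator="auto")

# On a GPU machine this would behave like the explicit form below.
# trainer = Trainer(accelerator="gpu", devices=1)
```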

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@justusschock
Member

Shouldn't we use all available devices for that type of device then?
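For reference, on a CUDA machine the "all available devices" count being asked about could be queried like this (illustrative only, not part of the PR):

```python
import torch

# Number of visible CUDA devices that a "use all devices" default could select.
num_gpus = torch.cuda.device_count()
print(f"Visible GPUs: {num_gpus}")
```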

@kaushikb11
Contributor Author

kaushikb11 commented Oct 27, 2021

@justusschock @tchaton

Then what would be the difference between Trainer(accelerator="auto") and Trainer(accelerator="auto", devices="auto"), which we plan to support soon?
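To make the distinction concrete (a sketch; devices="auto" was not yet supported at the time of this comment):

```python
from pytorch_lightning import Trainer

# Under this PR: auto-detected accelerator type, but only one device.
single_device = Trainer(accelerator="auto")

# Planned follow-up: auto-detected accelerator type and all available devices.
all_devices = Trainer(accelerator="auto", devices="auto")
```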

@mergify mergify bot added the ready label (PRs ready to be merged) Oct 28, 2021
@awaelchli awaelchli added this to the v1.5 milestone Oct 28, 2021
@mergify mergify bot removed the has conflicts label Nov 1, 2021
@kaushikb11
Contributor Author

kaushikb11 commented Nov 1, 2021

I am putting this PR on hold. IMO, if a user just does Trainer(accelerator="auto"), we should fail with an error and ask them to pass devices=x (or gpus if the machine has GPUs), as we do right now, rather than assigning only one device for them.

For this particular reason, we support devices="auto". Your thoughts?

@PyTorchLightning/core-contributors
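A rough sketch of the alternative described here, failing fast instead of silently selecting one device; the helper name and error message are hypothetical, not the actual accelerator-connector code:

```python
from pytorch_lightning.utilities.exceptions import MisconfigurationException

def _validate_auto_accelerator(accelerator, devices):
    # Hypothetical check: refuse to guess a device count when only the
    # accelerator was auto-selected, and ask the user to be explicit.
    if accelerator == "auto" and devices is None:
        raise MisconfigurationException(
            "`accelerator='auto'` was set but no `devices` were requested. "
            "Please pass `devices=<count>` (or `gpus=<count>` on a GPU machine)."
        )
```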

@Lightning-AI Lightning-AI deleted a comment from codecov bot Nov 1, 2021
@awaelchli awaelchli modified the milestones: v1.5, v1.6 Nov 1, 2021
@carmocca
Contributor

carmocca commented Nov 1, 2021

It could also default to devices="auto" if only accelerator="auto" was passed.
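What this suggestion could look like, roughly (a hypothetical sketch of the defaulting logic, not the actual implementation):

```python
def _resolve_devices(accelerator, devices):
    # Hypothetical: if only the accelerator was auto-selected, default the
    # device count to "auto" so all available devices of that type are used.
    if accelerator == "auto" and devices is None:
        return "auto"
    return devices
```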

@carmocca carmocca marked this pull request as draft November 2, 2021 15:46
@Lightning-AI Lightning-AI deleted a comment from github-actions bot Nov 2, 2021
@ananthsub
Contributor

ananthsub commented Nov 3, 2021

@kaushikb11 where is the discussion/issue/RFC describing what these PRs are trying to accomplish? It is very hard to review this without that background.

@akihironitta
Contributor

akihironitta commented Nov 11, 2021

I'm not aware of the context outside this PR at all, but I don't think we should accept this change. If using only one device out of several available ones were more common than using all of them, this would be fine, but that's not the common case, right?

In my opinion, it should either default to devices="auto" as carmocca suggests, because I think that's what users (or at least I) expect, or raise an exception asking users to specify devices, as kaushikb11 suggests.

@kaushikb11 kaushikb11 added the discussion label (In a discussion stage) and removed the ready label (PRs ready to be merged) Nov 11, 2021
@codecov

codecov bot commented Nov 11, 2021

Codecov Report

Merging #10192 (ac50b55) into master (305a42c) will decrease coverage by 4%.
The diff coverage is 50%.

@@           Coverage Diff            @@
##           master   #10192    +/-   ##
========================================
- Coverage      92%      89%    -4%     
========================================
  Files         181      181            
  Lines       16424    16424            
========================================
- Hits        15178    14545   -633     
- Misses       1246     1879   +633     

@tchaton
Contributor

tchaton commented Nov 18, 2021

IMO, accelerator="auto" should be the default and devices should become a required argument.

@kaushikb11
Contributor Author

devices is set to "auto" when it's not passed. Closing the PR.
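A usage sketch of the behavior described in this closing comment: with devices defaulting to "auto" when omitted, the two calls below end up equivalent.

```python
from pytorch_lightning import Trainer

# devices falls back to "auto" when not passed, so these are equivalent:
trainer_a = Trainer(accelerator="auto")
trainer_b = Trainer(accelerator="auto", devices="auto")
```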

@kaushikb11 kaushikb11 closed this Mar 22, 2022
@rohitgr7 rohitgr7 deleted the fix/set_devices_one branch March 22, 2022 08:35
Labels
bug (Something isn't working), discussion (In a discussion stage)