You've made too many requests, please wait one minute and try again. #145


Closed
insoxin opened this issue Dec 5, 2022 · 2 comments

insoxin commented Dec 5, 2022

Request ID: ..
Please contact us through our help center if this issue persists.
Return to homepage

hallacy (Collaborator) commented Dec 5, 2022

Hi @insoxin, thanks for writing in!

This error looks like our system working as intended to rate limit users. Is there a different response you are expecting?
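Since the 429 response is the rate limiter working as intended, the usual client-side remedy is to retry with exponential backoff. A minimal sketch, where `RateLimitError` is a stand-in for whatever exception your client raises on a 429 (not necessarily the library's actual class):

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for the 429 'too many requests' error the API returns."""

def call_with_backoff(fn, max_retries=5, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on RateLimitError."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # Wait base_delay * 2^attempt plus a little jitter, so
            # concurrent clients don't all retry at the same instant.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

The jitter term is a common refinement: without it, many clients hitting the limit at once would retry in lockstep and trip the limiter again.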

borisdayma pushed a commit to borisdayma/openai-python that referenced this issue Dec 7, 2022
* overload output type depending on stream literal (openai#142)

* Bump to v22

* [numpy] change version (openai#143)

* [numpy] change version

* update comments

* no version for numpy

* Fix timeouts (openai#137)

* Fix timeouts

* Rename to request_timeout and add to readme

* Dev/hallacy/request timeout takes tuples (openai#144)

* Add tuple typing for request_timeout

* imports

* [api_requestor] Log request_id with response (openai#145)

* Only import wandb as needed (openai#146)

Co-authored-by: Felipe Petroski Such <[email protected]>
Co-authored-by: Henrique Oliveira Pinto <[email protected]>
Co-authored-by: Rachel Lim <[email protected]>
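The changelog above (openai#137, openai#144) indicates that `request_timeout` accepts either a single number or a `(connect, read)` tuple. A minimal sketch of how such a value might be normalized before being handed to a requests-style HTTP call; `normalize_timeout` is a hypothetical helper for illustration, not the library's actual code:

```python
from typing import Tuple, Union

# A timeout may be one number (applied to both phases) or an
# explicit (connect, read) pair, as in the requests library.
Timeout = Union[float, Tuple[float, float]]

def normalize_timeout(request_timeout: Timeout) -> Tuple[float, float]:
    """Hypothetical helper: turn a float or (connect, read) tuple
    into the (connect, read) pair an HTTP client expects."""
    if isinstance(request_timeout, tuple):
        connect, read = request_timeout
        return (float(connect), float(read))
    return (float(request_timeout), float(request_timeout))
```

Splitting the two phases lets callers fail fast on an unreachable host (short connect timeout) while still allowing a slow streaming response (long read timeout).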
hallacy (Collaborator) commented Dec 11, 2022

I'm going to close this due to inactivity. Please let me know if there's additional information!

@hallacy hallacy closed this as completed Dec 11, 2022
PeterParkette pushed a commit to PeterParkette/openai-python that referenced this issue Mar 31, 2023
cgayapr pushed a commit to cgayapr/openai-python that referenced this issue Dec 14, 2024