After using gptscript for a while with GitHub-hosted tools, I started hitting this:
2024/05/04 08:38:36 failed resolving github.com/gptscript-ai/search/duckduckgo at ./get-blog-posts-search.gpt: failed to GitHub commit of gptscript-ai/search at HEAD: 403 Forbidden {"message":"API rate limit exceeded for <my-ip>. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://docs.github.com/rest/overview/resources-in-the-rest-api#rate-limiting"}
At first I thought this was DuckDuckGo rate limiting me, but then I realised it was GitHub rate limiting the resolution of the module itself!
Can we cache the result of whatever we're using the GitHub API for (presumably resolving the ref to a commit)?
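For illustration, here is a minimal sketch of the kind of cache I have in mind, assuming the resolver looks up a commit SHA per owner/repo@ref via the GitHub API. All the names below are hypothetical and are not gptscript's actual internals:

```go
package github

import (
	"sync"
	"time"
)

// commitCache is a hypothetical in-memory cache mapping "owner/repo@ref"
// to a resolved commit SHA, so repeated resolutions within a run don't
// re-hit the GitHub API.
type commitCache struct {
	mu      sync.Mutex
	ttl     time.Duration
	entries map[string]cacheEntry
}

type cacheEntry struct {
	commit  string
	fetched time.Time
}

func newCommitCache(ttl time.Duration) *commitCache {
	return &commitCache{
		ttl:     ttl,
		entries: map[string]cacheEntry{},
	}
}

// get returns a cached commit SHA if it is still fresh.
func (c *commitCache) get(key string) (string, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	e, ok := c.entries[key]
	if !ok || time.Since(e.fetched) > c.ttl {
		return "", false
	}
	return e.commit, true
}

// put stores a freshly resolved commit SHA.
func (c *commitCache) put(key, commit string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.entries[key] = cacheEntry{commit: commit, fetched: time.Now()}
}
```

Persisting this on disk (with a TTL for mutable refs like HEAD, and no expiry for pinned commits) would also help across runs; and since the error message notes that authenticated requests get a higher rate limit, letting the resolver send a token when one is available would be a complementary fix.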