feat: load model from modelscope #1283


Merged: 8 commits, Feb 17, 2025

Conversation

suluyana (Contributor)

Hello,

First of all, thank you very much for your team's high-quality open-source LLM model compression/quantization toolkit. In this pull request, I have added an environment variable, GPTQMODEL_USE_MODELSCOPE. With it set, users in China can retrieve models directly from ModelScope, which is significantly faster for them.

ModelScope is the largest model community in China. It is a platform that bridges model checkpoints with model applications, providing the infrastructure for sharing open models and for model-centric development. For more information, see the ModelScope GitHub page.
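For readers unfamiliar with how such an opt-in works, the switch can be sketched as follows. Only the environment variable name `GPTQMODEL_USE_MODELSCOPE` comes from this PR; the helper function and the hub-selection logic below are hypothetical illustrations, not GPTQModel's actual implementation.

```python
import os


def resolve_model_hub() -> str:
    """Illustrative sketch: choose the model hub based on the
    GPTQMODEL_USE_MODELSCOPE environment variable (env var name from
    this PR; the truthy-value handling here is an assumption)."""
    flag = os.environ.get("GPTQMODEL_USE_MODELSCOPE", "")
    if flag.strip().lower() in ("1", "true", "yes"):
        return "modelscope"
    return "huggingface"


# Users in China would opt in before loading a model:
os.environ["GPTQMODEL_USE_MODELSCOPE"] = "True"
print(resolve_model_hub())  # → modelscope
```

In a real loader, the `modelscope` branch would typically delegate to ModelScope's download utilities instead of the Hugging Face Hub, so no other call sites need to change.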

@Qubitium (Collaborator) left a comment:


@suluyana ModelScope is great for the China AI community. Thank you for the PR.

Only some nitpicks.

@Qubitium

@suluyana All good!

@Qubitium Qubitium merged commit d397991 into ModelCloud:main Feb 17, 2025
1 check passed