
When will quantization of CogVLM2 be supported? #1902

Open
EasonGZY opened this issue Jul 3, 2024 · 3 comments
EasonGZY commented Jul 3, 2024

Hello, may I ask when quantization of CogVLM2 will be supported? The model is Zhipu's https://huggingface.co/THUDM/cogvlm2-llama3-chat-19B. Could https://github.com/InternLM/lmdeploy/blob/main/docs/en/quantization/w4a16.md be used to help with the quantization? Thanks!
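For context, the w4a16 guide linked above quantizes a HuggingFace model with lmdeploy's `lite auto_awq` command. A sketch of that flow, pointed at the cogvlm2 checkpoint, would look like the following; whether this step works for cogvlm2 is exactly what this issue asks, so treat it as untested for this model:

```shell
# Sketch of the w4a16 (AWQ) flow from the linked guide, applied to
# cogvlm2. Flags follow the guide's example; cogvlm2 support for this
# step is the open question in this issue, so this may fail.
export HF_MODEL=THUDM/cogvlm2-llama3-chat-19B
export WORK_DIR=./cogvlm2-llama3-chat-19B-4bit

lmdeploy lite auto_awq \
    $HF_MODEL \
    --calib-dataset 'ptb' \
    --calib-samples 128 \
    --calib-seqlen 2048 \
    --w-bits 4 \
    --w-group-size 128 \
    --work-dir $WORK_DIR
```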

lvhan028 (Collaborator) commented Jul 3, 2024

lmdeploy 0.5.0 supports cogvlm2. You may give it a try.
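To try cogvlm2 inference with lmdeploy >= 0.5.0 as suggested, the VLM pipeline API can be used roughly as follows. This is an illustrative sketch (it downloads the 19B model and needs a GPU; the image URL is a placeholder), not a verified run:

```python
# Sketch of cogvlm2 inference with lmdeploy's VLM pipeline, assuming
# lmdeploy >= 0.5.0. Requires a GPU and downloads the 19B checkpoint,
# so treat this as illustrative rather than verified.
from lmdeploy import pipeline
from lmdeploy.vl import load_image

pipe = pipeline('THUDM/cogvlm2-llama3-chat-19B')

# Any reachable image URL or local path works here; this one is a placeholder.
image = load_image('tiger.jpeg')
response = pipe(('describe this image', image))
print(response.text)
```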

lvhan028 (Collaborator) commented Jul 9, 2024

@AllentDan @grimoire

lvhan028 (Collaborator) commented Jul 9, 2024

> 0.5.0 has supported cogvlm2. May give it a try.

My mistake. 0.5.0 supports cogvlm2 but does not support its quantization yet.
