InternLM-Xcomposer2.5 was just released, and its performance is really impressive.
I tried to use lmdeploy to accelerate inference, but it failed. Are there any plans to support it?
I would appreciate it very much, and I think it would benefit the community a lot, too.
I tried to run xcomposer2.5 with lmdeploy, but ran into this problem. I assume lmdeploy does not support xcomposer2.5 inference yet.
Yes, we haven't supported it yet, but support is coming soon.
Glad to hear that! I'm really looking forward to the support. Thanks very much.