
[Feature] Is there any plan to support InternLM-XComposer2.5 inference? #1920

Charles-Xie opened this issue Jul 4, 2024 · 2 comments
Charles-Xie commented Jul 4, 2024

Motivation

InternLM-XComposer2.5 was just released, and its performance is really amazing.

I tried to use lmdeploy to accelerate its inference, but it failed. Is there any plan to support it?

I would appreciate it very much, and I think it would benefit the community a lot, too.

Related resources

[Screenshot of the error raised by lmdeploy]

I tried to run XComposer2.5 with lmdeploy, but I came across this problem (see the screenshot above). I guess lmdeploy does not support XComposer2.5 inference yet. A rough sketch of what I ran is below.
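For reference, here is roughly the snippet I ran (a minimal sketch using lmdeploy's VLM pipeline API; the model repo id and test image URL are placeholders for whatever you point it at):

```python
from lmdeploy import pipeline
from lmdeploy.vl import load_image

# Hugging Face repo id for InternLM-XComposer2.5 (placeholder; adjust to your local path)
MODEL = 'internlm/internlm-xcomposer2d5-7b'

# Building the pipeline is where the failure occurs, since the
# XComposer2.5 architecture is not recognized by lmdeploy yet.
pipe = pipeline(MODEL)

# Load any test image and run a simple image-description prompt.
image = load_image('https://example.com/test.jpg')
response = pipe(('describe this image', image))
print(response.text)
```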

Additional context

No response

irexyc (Collaborator) commented Jul 4, 2024

Yes, we haven't supported it yet, but will soon.

Charles-Xie (Author) commented

> Yes, we haven't supported it yet, but will soon.

Glad to hear that! I'm really looking forward to the support. Thanks very much.
