
[Bug] internvl-chat-v-1-5 predict #1918

Closed
1 of 2 tasks
HalcyonLiang opened this issue Jul 4, 2024 · 4 comments
HalcyonLiang commented Jul 4, 2024

Checklist

  • 1. I have searched related issues but cannot get the expected help.
  • 2. The bug has not been fixed in the latest version.

Describe the bug

Error message:
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/vl_async_engine.py", line 118, in call
return super().call(prompts, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/async_engine.py", line 304, in call
return self.batch_infer(prompts,
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/vl_async_engine.py", line 104, in batch_infer
return super().batch_infer(prompts, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/async_engine.py", line 428, in batch_infer
_get_event_loop().run_until_complete(gather())
File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/async_engine.py", line 425, in gather
await asyncio.gather(
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/async_engine.py", line 410, in _inner_call
async for out in generator:
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/async_engine.py", line 563, in generate
prompt_input = await self._get_prompt_input(prompt, do_preprocess,
File "/usr/local/lib/python3.10/dist-packages/lmdeploy/serve/vl_async_engine.py", line 54, in _get_prompt_input
segs = decorated.split(IMAGE_TOKEN)
AttributeError: 'NoneType' object has no attribute 'split'

The error says the prompt comes out as None, even though the pipeline was actually given a prompt.

Reproduction

Usage:
from lmdeploy import pipeline, TurbomindEngineConfig
from lmdeploy.vl import load_image

pipe = pipeline('./internvl')

prompts = [
    {
        'role': 'user',
        'content': [
            {'type': 'text', 'text': '你好'},
            # {'type': 'image_url', 'image_url': {'url': 'https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg'}}
        ]
    }
]

response = pipe(prompts)

Environment

lmdeploy 0.5.0 
cuda: 12.3

Error traceback

No response

irexyc (Collaborator) commented Jul 4, 2024

Try renaming the folder: internvl -> InternVL-Chat-V1-5
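
A minimal sketch of the suggested fix, assuming the local checkpoint sits at ./internvl as in the reproduction above; the new directory name mirrors the Hugging Face repo name (OpenGVLab/InternVL-Chat-V1-5) so lmdeploy can recognize the model:

import os

from lmdeploy import pipeline

# Rename the checkpoint directory so its name matches the Hugging Face repo name;
# lmdeploy uses this name to pick the chat template.
os.rename('./internvl', './InternVL-Chat-V1-5')

pipe = pipeline('./InternVL-Chat-V1-5')
response = pipe([{
    'role': 'user',
    'content': [{'type': 'text', 'text': 'Hello'}],
}])
print(response)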

HalcyonLiang (Author) commented:

InternVL-Chat-V1-5

That works now..... Is the model currently identified by the folder name?

irexyc (Collaborator) commented Jul 4, 2024

@HalcyonLiang

The chat template is matched against the folder name, and this name didn't match. With the default ERROR log level, nothing useful shows up in the log..

I'd suggest using locally whatever name the model has on Hugging Face.
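
As an alternative to renaming the directory, the chat template can also be pinned explicitly so it no longer depends on folder-name matching, and raising the log level makes the template-matching messages visible. A minimal sketch, assuming lmdeploy 0.5.0's ChatTemplateConfig and that 'internvl-internlm2' is the registered template name for InternVL-Chat-V1-5 (verify the exact name with lmdeploy list):

from lmdeploy import ChatTemplateConfig, pipeline

# Pin the chat template explicitly instead of relying on folder-name matching.
# 'internvl-internlm2' is an assumption here; check `lmdeploy list` for the exact name.
pipe = pipeline(
    './internvl',
    chat_template_config=ChatTemplateConfig(model_name='internvl-internlm2'),
    log_level='INFO',  # the default ERROR level hides the template-matching messages
)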

HalcyonLiang (Author) commented:

@HalcyonLiang

The chat template is matched against the folder name, and this name didn't match. With the default ERROR log level, nothing useful shows up in the log..

I'd suggest using locally whatever name the model has on Hugging Face.

OK, thanks~
