
[Bug] InternLM2MLP.forward() missing 1 required positional argument: 'im_mask' #1847

Open
2 tasks done
jiangjingz opened this issue Jun 25, 2024 · 2 comments

@jiangjingz

Checklist

- [x] 1. I have searched related issues but cannot get the expected help.
- [x] 2. The bug has not been fixed in the latest version.

Describe the bug

When attempting to quantize internlm_xcomposer_vl_7b, the following error is reported:
TypeError: InternLM2MLP.forward() missing 1 required positional argument: 'im_mask'

Reproduction

```shell
lmdeploy lite auto_awq \
    $HF_MODEL \
    --calib-dataset 'ptb' \
    --calib-samples 128 \
    --calib-seqlen 2048 \
    --batch-size 1 \
    --w-bits 4 \
    --w-group-size 128 \
    --search-scale False \
    --work-dir $WORK_DIR
```

Environment

python 3.10
torch 2.0.1
lmdeploy 0.4.2

Error traceback

Traceback (most recent call last):
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/bin/lmdeploy", line 8, in <module>
    sys.exit(run())
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/cli/entrypoint.py", line 37, in run
    args.run(args)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/cli/lite.py", line 137, in auto_awq
    auto_awq(**kwargs)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/apis/auto_awq.py", line 96, in auto_awq
    vl_model, model, tokenizer, work_dir = calibrate(model,
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/apis/calibrate.py", line 235, in calibrate
    calib_ctx.calibrate(all_data)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/quantization/calibration.py", line 315, in calibrate
    _ = model(data.to(self.device))
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/manxue.jj/.cache/huggingface/modules/transformers_modules/trained/modeling_internlm2.py", line 929, in forward
    layer_outputs = decoder_layer(
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/quantization/calibration.py", line 505, in _forward
    auto_scale_block(mod, batch_kwargs[i], self.w_bits,
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/quantization/calibration.py", line 407, in auto_scale_block
    _auto_get_scale(
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/quantization/calibration.py", line 400, in _auto_get_scale
    best_ratio = _search_module_scale(module2inspect, layers, inp.value,
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/lmdeploy/lite/quantization/calibration.py", line 352, in _search_module_scale
    org_out = block(x, **kwargs)
  File "/data/manxue.jj/anaconda3/envs/py310_torch201/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: InternLM2MLP.forward() missing 1 required positional argument: 'im_mask'
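The traceback shows the failure happens in lmdeploy's scale search, where `_search_module_scale` calls `org_out = block(x, **kwargs)`: the xcomposer variant of `InternLM2MLP.forward()` declares `im_mask` as a required positional parameter, but the captured kwargs for the block do not include it. A minimal, dependency-free sketch of the failure mode (the class names `StrictMLP`/`PatchedMLP` and the default-argument workaround are hypothetical illustrations, not the actual fix in lmdeploy):

```python
class StrictMLP:
    """Stands in for the xcomposer InternLM2MLP: im_mask has no default."""
    def forward(self, x, im_mask):
        # The real model uses im_mask to distinguish image tokens; here we
        # only care about the call signature.
        return x

class PatchedMLP:
    """Hypothetical workaround: give im_mask a default so calibration
    code that doesn't pass it can still call the block."""
    def forward(self, x, im_mask=None):
        return x

def search_module_scale(block, x, **kwargs):
    # Mirrors the shape of lmdeploy's `org_out = block(x, **kwargs)` call:
    # only the hidden states plus captured kwargs are forwarded.
    return block.forward(x, **kwargs)

try:
    search_module_scale(StrictMLP(), [1.0])
except TypeError as e:
    print(e)  # ... missing 1 required positional argument: 'im_mask'

print(search_module_scale(PatchedMLP(), [1.0]))
```

This is why the error only appears when the scale-search path is exercised: other code paths go through the decoder layer's `forward`, which does receive `im_mask`.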
@hekaijie123

I ran into the same problem: it occurs when I pass the --search-scale and --batch-size arguments.
With only the --work-dir argument, as in `lmdeploy lite auto_awq internlm-xcomposer2-vl-7b --work-dir fix_vit_newbalancetag-4bit`, the problem does not occur.

@AllentDan
Collaborator

Fixed in #1890
