AttributeError: module 'transformers_modules.InternVL2-2B-1epoch.tokenization_internlm2' has no attribute 'InternLM2Tokenizer' #1663
Comments
Can you reproduce this reliably? This error has been reported before, but we have never been able to reproduce it.

It happens with multi-node, multi-GPU training. Downgrading to transformers==4.37.2 does not help; the error still occurs.

MiniCPM-V-2_6 hits the same error as well, with an identical traceback.

Is there a fix for this? Multi-node, multi-GPU runs almost always fail (9 out of 10 runs hit this error, 1 succeeds), and every crash means re-queuing the job.

This is still unresolved. It keeps raising AttributeError: module 'transformers_modules.InternVL2-2B-1epoch.tokenization_internlm2' has no attribute 'InternLM2Tokenizer'. Renaming the model directory back to InternVL2-2B (the original name) does not help either.
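The failure pattern described above (multi-node only, intermittent, roughly 9 failures in 10 runs) is consistent with a race: several ranks copy the repo's tokenization_internlm2.py into the `transformers_modules` cache at the same time, and one rank imports a partially written file. A minimal sketch of a rank-0-first workaround is below; `load_fn` and `barrier_fn` are hypothetical stand-ins for `AutoTokenizer.from_pretrained(...)` and `torch.distributed.barrier()`, which the real training script would pass in.

```python
import os

def load_with_rank0_first(load_fn, barrier_fn):
    """Serialize dynamic-module loading: rank 0 materializes the
    transformers_modules cache first, the other ranks wait at the
    barrier, then load from the already-written files."""
    rank = int(os.environ.get("RANK", "0"))
    if rank == 0:
        result = load_fn()   # rank 0 writes ~/.cache/huggingface/modules/...
        barrier_fn()         # release the other ranks
    else:
        barrier_fn()         # wait until rank 0 has finished writing
        result = load_fn()   # cache files are now complete
    return result
```

This is only a sketch under the race-condition assumption; it does not change what is loaded, only the order in which ranks load it.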
```shell
torchrun \
    --nnodes $ARNOLD_WORKER_NUM \
    --node_rank $ARNOLD_ID \
    --master_addr $METIS_WORKER_0_HOST \
    --nproc_per_node $ARNOLD_WORKER_GPU \
    --master_port $port \
    examples/pytorch/llm/llm_sft.py \
    --model_type 'internvl2-2b' \
    --model_id_or_path $BASE_PATH/playground/models/InternVL2-2B-1epoch \
    --sft_type 'lora' \
    --tuner_backend 'peft' \
    --template_type 'AUTO' \
    --dtype 'AUTO'
```
Running InternVL2-2B training with the script above raises the error below. What causes it, and how can it be fixed?
```
Traceback (most recent call last):
    tokenizer_class = get_class_from_dynamic_module(class_ref, pretrained_model_name_or_path, **kwargs)
  File "/home/tiger/.local/lib/python3.9/site-packages/transformers/dynamic_module_utils.py", line 500, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module.replace(".py", ""))
  File "/home/tiger/.local/lib/python3.9/site-packages/transformers/dynamic_module_utils.py", line 201, in get_class_in_module
    return getattr(module, class_name)
AttributeError: module 'transformers_modules.InternVL2-2B-1epoch.tokenization_internlm2' has no attribute 'InternLM2Tokenizer'
```
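The traceback ends in `getattr(module, class_name)`: the cached module imports successfully, but the class is missing from it. A self-contained illustration of that failure mode (the file contents here are hypothetical, not the real tokenization_internlm2.py) shows how importing a partially written file succeeds while leaving the class undefined:

```python
import importlib.util
import os
import tempfile

# A complete file defines the class; a file caught mid-copy (e.g. another
# rank is still writing it) may contain only its leading lines.
complete = "class InternLM2Tokenizer:\n    pass\n"
truncated = "# coding: utf-8  (rest of the file not yet written)\n"

def load_module(source, name="tokenization_internlm2"):
    """Write `source` to a temp .py file and import it, the same way
    transformers imports cached dynamic modules from disk."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    spec = importlib.util.spec_from_file_location(name, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)  # import succeeds in both cases
    os.unlink(path)
    return mod

# The truncated module imports without error, but getattr on the class
# name fails, matching the AttributeError in the traceback above.
assert hasattr(load_module(complete), "InternLM2Tokenizer")
assert not hasattr(load_module(truncated), "InternLM2Tokenizer")
```

This only demonstrates why the error message blames `getattr` rather than the import itself; it does not prove the cache race is the cause in every report above.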