Can swift train models that are already quantized, e.g. the AWQ- or GPTQ-quantized models on ModelScope? #1472
Comments
After training finishes, can I merge the LoRA?
Training works, but after training there is no way to merge-lora.
Can it be trained with full fine-tuning?
In that case, my understanding is that only QLoRA training is possible. Plain LoRA training already loads the model in 16-bit, while AWQ-quantized weights are int4, so the training precision would not match. Full fine-tuning should not work either; that is my understanding.
@Jintao-Huang Can I train an AWQ model directly with the default LoRA? The default LoRA loads the model in 16/32-bit, right?
It is supported: QLoRA training.
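The point about merge-lora above can be made concrete with a small numeric sketch (stdlib only; the scale value and rank are illustrative assumptions, not taken from any real model). The quantized base weights sit exactly on the int4 grid, but base-plus-LoRA-delta generally does not, so folding the adapter back in would force a lossy re-quantization:

```python
import random

random.seed(0)

SCALE = 0.1  # assumed per-tensor quantization step, for illustration only

def quantize_int4(values, scale=SCALE):
    # Symmetric int4: each weight snaps to one of 16 levels, [-8, 7] * scale.
    return [max(-8, min(7, round(v / scale))) * scale for v in values]

def on_grid(values, scale=SCALE, eps=1e-9):
    # Losslessly mergeable only if every entry already lies on the int4 grid.
    return all(abs(v - q) < eps
               for v, q in zip(values, quantize_int4(values, scale)))

# Frozen AWQ/GPTQ-style base weights: by construction, on the int4 grid.
w_q = quantize_int4([random.gauss(0, 0.3) for _ in range(16)])

# A trained LoRA delta is an arbitrary small float update, not grid-aligned.
delta = [random.gauss(0, 0.05) for _ in range(16)]
merged = [w + d for w, d in zip(w_q, delta)]

print(on_grid(w_q))     # True: the quantized base is exactly representable
print(on_grid(merged))  # False: merged weights fall off the int4 grid
```

This is why the adapter has to be kept separate (or the merged 16-bit result re-quantized from scratch) after QLoRA training on a pre-quantized base.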
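For reference, a QLoRA run on a pre-quantized checkpoint looks roughly like the sketch below. This is a hedged illustration only: exact ms-swift flag names vary across versions, and the model id shown is a placeholder, so check `swift sft --help` for your installed version.

```shell
# Sketch, not a verified command line: flag names are assumptions
# based on older ms-swift releases; the model id is a placeholder.
swift sft \
  --model_type qwen-7b-chat-int4 \
  --sft_type lora \
  --dataset my-dataset
```

Because the base stays int4, the adapter weights are saved separately and loaded alongside the quantized model at inference time instead of being merged.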
The reason I ask: during training, loading the unquantized model and then quantizing after training is not an option, since I don't have the resources for it. Can I directly train the officially quantized model?