
Lora keys not loaded for SD3-Medium Lora #3701

Status: Open
bluvoll opened this issue Jun 13, 2024 · 10 comments
Labels: Feature (New feature or request)

Comments

bluvoll commented Jun 13, 2024

As the title says, LoRAs trained with Diffusers' scripts can't be loaded with the stock Lora Loader node.

[screenshot of the error]

comfyanonymous (Owner) commented:

Can you post a lora file in that format?

bluvoll (Author) commented Jun 13, 2024

> Can you post a lora file in that format?

Sure thing!
https://files.catbox.moe/e3jy64.safetensors

For the record, it was trained with this: https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/README_sd3.md

bluvoll (Author) commented Jun 13, 2024

@comfyanonymous I added another file with just 100 steps; it's the final output from Diffusers, rank 128 I think. This file should be "complete", since I noticed the intermediate checkpoints might be missing the text encoder (TENC).

https://files.catbox.moe/yrhyl5.safetensors

@mcmonkey4eva mcmonkey4eva added the Feature New feature or request label Jun 13, 2024
comfyanonymous (Owner) commented:

Commit ac151ac

Not sure what your lora is supposed to do so let me know if this works.

bluvoll (Author) commented Jun 13, 2024

> ac151ac
>
> Not sure what your lora is supposed to do so let me know if this works.

It's supposed to be an anime style, and while the result is botched, it loads and changes the image, thank you!

GavChap commented Jun 14, 2024

> ac151ac
>
> Not sure what your lora is supposed to do so let me know if this works.

@comfyanonymous I've tried a couple of LoRAs with the new code and they still give me the same error as #3701 (comment)

cryptoquick commented:
Same here; I've tried the new code with a couple of LoRAs and nothing changed. Do I need to restart the UI or recompile anything?

GavChap commented Jun 14, 2024

Actually, I found that some LoRA trainers (e.g. SimpleTuner) use different keys, so it's not the same error. When I fixed those by detecting those keys as well (I haven't made a pull request), I now get a tensor size error.
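
The key-detection fix GavChap describes isn't shown in the thread, but the general idea can be sketched as follows. This is a hypothetical helper, not ComfyUI's actual loader code: the kohya-style suffixes (`.lora_down.weight` / `.lora_up.weight`) are taken from the warning log later in the thread, and mapping them to diffusers-style `lora_A`/`lora_B` names is an assumption about the target convention.

```python
# Hypothetical sketch: normalize kohya-style LoRA key suffixes
# (.lora_down / .lora_up, as seen in the warning log) to
# diffusers-style names (.lora_A / .lora_B). The mapping itself
# is an assumption, not ComfyUI's real detection logic.

KOHYA_TO_DIFFUSERS = {
    ".lora_down.weight": ".lora_A.weight",  # down-projection matrix
    ".lora_up.weight": ".lora_B.weight",    # up-projection matrix
}

def normalize_lora_key(key: str) -> str:
    """Rewrite a kohya-style tensor key to its diffusers-style equivalent."""
    for old, new in KOHYA_TO_DIFFUSERS.items():
        if key.endswith(old):
            return key[: -len(old)] + new
    return key  # leave alpha scales and already-normalized keys untouched
```

Even with a remap like this, the loader still has to find model modules with matching shapes, which is presumably where the tensor size error GavChap mentions comes in.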

cryptoquick commented:
Here's a portion of my errors, in case it helps:

lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_in.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_in.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_in.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_out.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_out.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_proj_out.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.alpha
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight

comfyanonymous (Owner) commented Jun 15, 2024

If you get warnings about _unet_ keys, it means that LoRA was trained for an older SD version, not SD3.
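
A quick way to check this yourself is to inspect the tensor key names in the `.safetensors` file. The sketch below is a rough heuristic based only on the prefix patterns seen in this thread; treating a `transformer.` prefix as the diffusers SD3 convention is an assumption, not ComfyUI's actual detection code.

```python
# Rough heuristic for guessing which model family a LoRA targets,
# based on key prefixes seen in this thread. The classification rules
# are an assumption, not ComfyUI's actual detection code.
# In practice the keys would come from the .safetensors file, e.g.:
#   from safetensors import safe_open
#   with safe_open("lora.safetensors", framework="pt") as f:
#       keys = list(f.keys())

def classify_lora(keys):
    """Guess the target model family from a LoRA's tensor key names."""
    if any(k.startswith("lora_unet_") for k in keys):
        return "unet"  # SD1.x / SD2.x / SDXL-era LoRA; won't apply to SD3
    if any(k.startswith("transformer.") for k in keys):
        return "sd3"   # diffusers-style SD3 (MMDiT) LoRA, assumed prefix
    return "unknown"
```

Running this against the keys from cryptoquick's log would report "unet", which matches comfyanonymous's diagnosis.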

5 participants