Error loading pano.ckpt model #52

Open
srttt opened this issue Sep 25, 2024 · 0 comments
srttt commented Sep 25, 2024

python demo.py --text "This kitchen is a charming blend of rustic and modern, featuring a large reclaimed wood island with marble countertop, a sink surrounded by cabinets. To the left of the island, a stainless-steel refrigerator stands tall. To the right of the sink, built-in wooden cabinets painted in a muted."
/root/autodl-tmp/anaconda3/lib/python3.12/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: clean_up_tokenization_spaces was not set. It will be set to True by default. This behavior will be depracted in transformers v4.45, and will be then set to False by default. For more details check this issue: huggingface/transformers#31884
warnings.warn(
/root/autodl-tmp/code/MVDiffusion/demo.py:72: FutureWarning: You are using torch.load with weights_only=False (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for weights_only will be flipped to True. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via torch.serialization.add_safe_globals. We recommend you start setting weights_only=True for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
model.load_state_dict(torch.load('/root/autodl-tmp/files/pano.ckpt', map_location='cpu')['state_dict'], strict=True)
Traceback (most recent call last):
File "/root/autodl-tmp/code/MVDiffusion/demo.py", line 72, in
model.load_state_dict(torch.load('/root/autodl-tmp/files/pano.ckpt', map_location='cpu')['state_dict'], strict=True)
File "/root/autodl-tmp/anaconda3/lib/python3.12/site-packages/torch/nn/modules/module.py", line 2215, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for PanoGenerator:
Missing key(s) in state_dict: "vae.encoder.mid_block.attentions.0.to_q.weight", "vae.encoder.mid_block.attentions.0.to_q.bias", "vae.encoder.mid_block.attentions.0.to_k.weight", "vae.encoder.mid_block.attentions.0.to_k.bias", "vae.encoder.mid_block.attentions.0.to_v.weight", "vae.encoder.mid_block.attentions.0.to_v.bias", "vae.encoder.mid_block.attentions.0.to_out.0.weight", "vae.encoder.mid_block.attentions.0.to_out.0.bias", "vae.decoder.mid_block.attentions.0.to_q.weight", "vae.decoder.mid_block.attentions.0.to_q.bias", "vae.decoder.mid_block.attentions.0.to_k.weight", "vae.decoder.mid_block.attentions.0.to_k.bias", "vae.decoder.mid_block.attentions.0.to_v.weight", "vae.decoder.mid_block.attentions.0.to_v.bias", "vae.decoder.mid_block.attentions.0.to_out.0.weight", "vae.decoder.mid_block.attentions.0.to_out.0.bias".
Unexpected key(s) in state_dict: "text_encoder.text_model.embeddings.position_ids", "vae.encoder.mid_block.attentions.0.query.weight", "vae.encoder.mid_block.attentions.0.query.bias", "vae.encoder.mid_block.attentions.0.key.weight", "vae.encoder.mid_block.attentions.0.key.bias", "vae.encoder.mid_block.attentions.0.value.weight", "vae.encoder.mid_block.attentions.0.value.bias", "vae.encoder.mid_block.attentions.0.proj_attn.weight", "vae.encoder.mid_block.attentions.0.proj_attn.bias", "vae.decoder.mid_block.attentions.0.query.weight", "vae.decoder.mid_block.attentions.0.query.bias", "vae.decoder.mid_block.attentions.0.key.weight", "vae.decoder.mid_block.attentions.0.key.bias", "vae.decoder.mid_block.attentions.0.value.weight", "vae.decoder.mid_block.attentions.0.value.bias", "vae.decoder.mid_block.attentions.0.proj_attn.weight", "vae.decoder.mid_block.attentions.0.proj_attn.bias".
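
This looks like a library version mismatch rather than a corrupted checkpoint: the checkpoint stores the VAE mid-block attention weights under the old diffusers AttentionBlock parameter names (query / key / value / proj_attn), while the installed diffusers expects the newer Attention names (to_q / to_k / to_v / to_out.0). The stray text_encoder.text_model.embeddings.position_ids key is likewise a buffer that newer transformers versions no longer register. Below is a minimal sketch of a key-remapping workaround under those assumptions; the checkpoint path follows the command above, and model is assumed to be the PanoGenerator instance already constructed in demo.py.

```python
import torch

# Hypothetical workaround sketch: remap the old diffusers attention key names
# to the new ones before load_state_dict. Assumes the only mismatches are the
# VAE AttentionBlock rename plus the stale position_ids buffer.
ckpt = torch.load('/root/autodl-tmp/files/pano.ckpt', map_location='cpu',
                  weights_only=False)  # checkpoint is trusted here
state_dict = ckpt['state_dict']

# Old AttentionBlock parameter names -> new Attention parameter names.
rename = {
    '.query.': '.to_q.',
    '.key.': '.to_k.',
    '.value.': '.to_v.',
    '.proj_attn.': '.to_out.0.',
}

remapped = {}
for key, value in state_dict.items():
    # Newer transformers no longer registers this buffer; drop it.
    if key.endswith('text_model.embeddings.position_ids'):
        continue
    # Only touch the VAE mid-block attention weights named in the error.
    if 'mid_block.attentions' in key:
        for old, new in rename.items():
            key = key.replace(old, new)
    remapped[key] = value

model.load_state_dict(remapped, strict=True)
```

If remapping feels fragile, the simpler route is probably to install the diffusers (and transformers) versions the repo pins in its requirements, if any, so the checkpoint keys match the model without any translation.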
