
RuntimeError: unexpected keys in state_dict when loading RIFE model #383

Open
YANG-success-last opened this issue Dec 16, 2024 · 4 comments

Comments

@YANG-success-last

During inference, the reported error says that parameters are missing when the weights are loaded.
The model is loaded with the following code:

from model.RIFE import Model
model = Model()
model.load_model("train_log", -1)
Running the script produces the following error:
Traceback (most recent call last):
File "/home/liran001/RIFE/inference_img.py", line 29, in <module>
 model.load_model(params["modelDir"], -1)  # load the weights
File "/home/liran001/RIFE/model/RIFE.py", line 55, in load_model
 self.flownet.load_state_dict(convert(torch.load('{}/flownet.pkl'.format(path))))
File "/home/liran001/anaconda3/envs/pytorch/lib/python3.9/site-packages/torch/nn/modules/module.py", line 2584, in load_state_dict
 raise RuntimeError(
RuntimeError: Error(s) in loading state_dict for IFNet:
     Missing key(s) in state_dict: "block0.conv0.0.0.weight", "block0.conv0.0.0.bias", "block0.conv0.0.1.weight", "block0.conv0.1.0.weight", "block0.conv0.1.0.bias", "block0.conv0.1.1.weight", "block0.convblock.0.0.weight", "block0.convblock.0.0.bias", "block0.convblock.0.1.weight", "block0.convblock.1.0.weight", "block0.convblock.1.0.bias", "block0.convblock.1.1.weight", "block0.convblock.2.0.weight", "block0.convblock.2.0.bias", "block0.convblock.2.1.weight", "block0.convblock.3.0.weight", "block0.convblock.3.0.bias", "block0.convblock.3.1.weight", "block0.convblock.4.0.weight", "block0.convblock.4.0.bias", "block0.convblock.4.1.weight", "block0.convblock.5.0.weight", "block0.convblock.5.0.bias", "block0.convblock.5.1.weight", "block0.convblock.6.0.weight", "block0.convblock.6.0.bias", "block0.convblock.6.1.weight", "block0.convblock.7.0.weight", "block0.convblock.7.0.bias", "block0.convblock.7.1.weight", "block0.lastconv.weight", "block0.lastconv.bias", "block1.conv0.0.0.weight", "block1.conv0.0.0.bias", "block1.conv0.0.1.weight", "block1.conv0.1.0.weight", "block1.conv0.1.0.bias", "block1.conv0.1.1.weight", "block1.convblock.0.0.weight", "block1.convblock.0.0.bias", "block1.convblock.0.1.weight", "block1.convblock.1.0.weight", "block1.convblock.1.0.bias", "block1.convblock.1.1.weight", "block1.convblock.2.0.weight", "block1.convblock.2.0.bias", "block1.convblock.2.1.weight", "block1.convblock.3.0.weight", "block1.convblock.3.0.bias", "block1.convblock.3.1.weight", "block1.convblock.4.0.weight", "block1.convblock.4.0.bias", "block1.convblock.4.1.weight", "block1.convblock.5.0.weight", "block1.convblock.5.0.bias", "block1.convblock.5.1.weight", "block1.convblock.6.0.weight", "block1.convblock.6.0.bias", "block1.convblock.6.1.weight", "block1.convblock.7.0.weight", "block1.convblock.7.0.bias", "block1.convblock.7.1.weight", "block1.lastconv.weight", "block1.lastconv.bias", "block2.conv0.0.0.weight", "block2.conv0.0.0.bias", "block2.conv0.0.1.weight", "block2.conv0.1.0.weight", "block2.conv0.1.0.bias", "block2.conv0.1.1.weight", "block2.convblock.0.0.weight", "block2.convblock.0.0.bias", "block2.convblock.0.1.weight", "block2.convblock.1.0.weight", "block2.convblock.1.0.bias", "block2.convblock.1.1.weight", "block2.convblock.2.0.weight", "block2.convblock.2.0.bias", "block2.convblock.2.1.weight", "block2.convblock.3.0.weight", "block2.convblock.3.0.bias", "block2.convblock.3.1.weight", "block2.convblock.4.0.weight", "block2.convblock.4.0.bias", "block2.convblock.4.1.weight", "block2.convblock.5.0.weight", "block2.convblock.5.0.bias", "block2.convblock.5.1.weight", "block2.convblock.6.0.weight", "block2.convblock.6.0.bias", "block2.convblock.6.1.weight", "block2.convblock.7.0.weight", "block2.convblock.7.0.bias", "block2.convblock.7.1.weight", "block2.lastconv.weight", "block2.lastconv.bias", "block_tea.conv0.0.0.weight", "block_tea.conv0.0.0.bias", "block_tea.conv0.0.1.weight", "block_tea.conv0.1.0.weight", "block_tea.conv0.1.0.bias", "block_tea.conv0.1.1.weight", "block_tea.convblock.0.0.weight", "block_tea.convblock.0.0.bias", "block_tea.convblock.0.1.weight", "block_tea.convblock.1.0.weight", "block_tea.convblock.1.0.bias", "block_tea.convblock.1.1.weight", "block_tea.convblock.2.0.weight", "block_tea.convblock.2.0.bias", "block_tea.convblock.2.1.weight", "block_tea.convblock.3.0.weight", "block_tea.convblock.3.0.bias", "block_tea.convblock.3.1.weight", "block_tea.convblock.4.0.weight", "block_tea.convblock.4.0.bias", "block_tea.convblock.4.1.weight", "block_tea.convblock.5.0.weight", 
"block_tea.convblock.5.0.bias", "block_tea.convblock.5.1.weight", "block_tea.convblock.6.0.weight", "block_tea.convblock.6.0.bias", "block_tea.convblock.6.1.weight", "block_tea.convblock.7.0.weight", "block_tea.convblock.7.0.bias", "block_tea.convblock.7.1.weight", "block_tea.lastconv.weight", "block_tea.lastconv.bias", "contextnet.conv1.conv1.0.weight", "contextnet.conv1.conv1.0.bias", "contextnet.conv1.conv1.1.weight", "contextnet.conv1.conv2.0.weight", "contextnet.conv1.conv2.0.bias", "contextnet.conv1.conv2.1.weight", "contextnet.conv2.conv1.0.weight", "contextnet.conv2.conv1.0.bias", "contextnet.conv2.conv1.1.weight", "contextnet.conv2.conv2.0.weight", "contextnet.conv2.conv2.0.bias", "contextnet.conv2.conv2.1.weight", "contextnet.conv3.conv1.0.weight", "contextnet.conv3.conv1.0.bias", "contextnet.conv3.conv1.1.weight", "contextnet.conv3.conv2.0.weight", "contextnet.conv3.conv2.0.bias", "contextnet.conv3.conv2.1.weight", "contextnet.conv4.conv1.0.weight", "contextnet.conv4.conv1.0.bias", "contextnet.conv4.conv1.1.weight", "contextnet.conv4.conv2.0.weight", "contextnet.conv4.conv2.0.bias", "contextnet.conv4.conv2.1.weight", "unet.down0.conv1.0.weight", "unet.down0.conv1.0.bias", "unet.down0.conv1.1.weight", "unet.down0.conv2.0.weight", "unet.down0.conv2.0.bias", "unet.down0.conv2.1.weight", "unet.down1.conv1.0.weight", "unet.down1.conv1.0.bias", "unet.down1.conv1.1.weight", "unet.down1.conv2.0.weight", "unet.down1.conv2.0.bias", "unet.down1.conv2.1.weight", "unet.down2.conv1.0.weight", "unet.down2.conv1.0.bias", "unet.down2.conv1.1.weight", "unet.down2.conv2.0.weight", "unet.down2.conv2.0.bias", "unet.down2.conv2.1.weight", "unet.down3.conv1.0.weight", "unet.down3.conv1.0.bias", "unet.down3.conv1.1.weight", "unet.down3.conv2.0.weight", "unet.down3.conv2.0.bias", "unet.down3.conv2.1.weight", "unet.up0.0.weight", "unet.up0.0.bias", "unet.up0.1.weight", "unet.up1.0.weight", "unet.up1.0.bias", "unet.up1.1.weight", "unet.up2.0.weight", "unet.up2.0.bias", "unet.up2.1.weight", "unet.up3.0.weight", "unet.up3.0.bias", "unet.up3.1.weight", "unet.conv.weight", "unet.conv.bias". 
@hzwer
Owner

hzwer commented Dec 16, 2024

Then what does torch.load('{}/flownet.pkl'.format(path)).keys() give you?
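
A minimal way to inspect this (a sketch; 'train_log' is assumed to be the checkpoint directory used above):

import torch
state_dict = torch.load('train_log/flownet.pkl', map_location='cpu')  # load the raw checkpoint on CPU
print(state_dict.keys())  # parameter names stored in the checkpoint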

@YANG-success-last
Author

Keys in the loaded state_dict:
odict_keys(['block0.conv0.0.0.weight', 'block0.conv0.0.0.bias', 'block0.conv0.0.1.weight', 'block0.conv0.1.0.weight', 'block0.conv0.1.0.bias', 'block0.conv0.1.1.weight', 'block0.convblock.0.0.weight', 'block0.convblock.0.0.bias', 'block0.convblock.0.1.weight', 'block0.convblock.1.0.weight', 'block0.convblock.1.0.bias', 'block0.convblock.1.1.weight', 'block0.convblock.2.0.weight', 'block0.convblock.2.0.bias', 'block0.convblock.2.1.weight', 'block0.convblock.3.0.weight', 'block0.convblock.3.0.bias', 'block0.convblock.3.1.weight', 'block0.convblock.4.0.weight', 'block0.convblock.4.0.bias', 'block0.convblock.4.1.weight', 'block0.convblock.5.0.weight', 'block0.convblock.5.0.bias', 'block0.convblock.5.1.weight', 'block0.convblock.6.0.weight', 'block0.convblock.6.0.bias', 'block0.convblock.6.1.weight', 'block0.convblock.7.0.weight', 'block0.convblock.7.0.bias', 'block0.convblock.7.1.weight', 'block0.lastconv.weight', 'block0.lastconv.bias', 'block1.conv0.0.0.weight', 'block1.conv0.0.0.bias', 'block1.conv0.0.1.weight', 'block1.conv0.1.0.weight', 'block1.conv0.1.0.bias', 'block1.conv0.1.1.weight', 'block1.convblock.0.0.weight', 'block1.convblock.0.0.bias', 'block1.convblock.0.1.weight', 'block1.convblock.1.0.weight', 'block1.convblock.1.0.bias', 'block1.convblock.1.1.weight', 'block1.convblock.2.0.weight', 'block1.convblock.2.0.bias', 'block1.convblock.2.1.weight', 'block1.convblock.3.0.weight', 'block1.convblock.3.0.bias', 'block1.convblock.3.1.weight', 'block1.convblock.4.0.weight', 'block1.convblock.4.0.bias', 'block1.convblock.4.1.weight', 'block1.convblock.5.0.weight', 'block1.convblock.5.0.bias', 'block1.convblock.5.1.weight', 'block1.convblock.6.0.weight', 'block1.convblock.6.0.bias', 'block1.convblock.6.1.weight', 'block1.convblock.7.0.weight', 'block1.convblock.7.0.bias', 'block1.convblock.7.1.weight', 'block1.lastconv.weight', 'block1.lastconv.bias', 'block2.conv0.0.0.weight', 'block2.conv0.0.0.bias', 'block2.conv0.0.1.weight', 'block2.conv0.1.0.weight', 'block2.conv0.1.0.bias', 'block2.conv0.1.1.weight', 'block2.convblock.0.0.weight', 'block2.convblock.0.0.bias', 'block2.convblock.0.1.weight', 'block2.convblock.1.0.weight', 'block2.convblock.1.0.bias', 'block2.convblock.1.1.weight', 'block2.convblock.2.0.weight', 'block2.convblock.2.0.bias', 'block2.convblock.2.1.weight', 'block2.convblock.3.0.weight', 'block2.convblock.3.0.bias', 'block2.convblock.3.1.weight', 'block2.convblock.4.0.weight', 'block2.convblock.4.0.bias', 'block2.convblock.4.1.weight', 'block2.convblock.5.0.weight', 'block2.convblock.5.0.bias', 'block2.convblock.5.1.weight', 'block2.convblock.6.0.weight', 'block2.convblock.6.0.bias', 'block2.convblock.6.1.weight', 'block2.convblock.7.0.weight', 'block2.convblock.7.0.bias', 'block2.convblock.7.1.weight', 'block2.lastconv.weight', 'block2.lastconv.bias', 'block_tea.conv0.0.0.weight', 'block_tea.conv0.0.0.bias', 'block_tea.conv0.0.1.weight', 'block_tea.conv0.1.0.weight', 'block_tea.conv0.1.0.bias', 'block_tea.conv0.1.1.weight', 'block_tea.convblock.0.0.weight', 'block_tea.convblock.0.0.bias', 'block_tea.convblock.0.1.weight', 'block_tea.convblock.1.0.weight', 'block_tea.convblock.1.0.bias', 'block_tea.convblock.1.1.weight', 'block_tea.convblock.2.0.weight', 'block_tea.convblock.2.0.bias', 'block_tea.convblock.2.1.weight', 'block_tea.convblock.3.0.weight', 'block_tea.convblock.3.0.bias', 'block_tea.convblock.3.1.weight', 'block_tea.convblock.4.0.weight', 'block_tea.convblock.4.0.bias', 'block_tea.convblock.4.1.weight', 'block_tea.convblock.5.0.weight', 
'block_tea.convblock.5.0.bias', 'block_tea.convblock.5.1.weight', 'block_tea.convblock.6.0.weight', 'block_tea.convblock.6.0.bias', 'block_tea.convblock.6.1.weight', 'block_tea.convblock.7.0.weight', 'block_tea.convblock.7.0.bias', 'block_tea.convblock.7.1.weight', 'block_tea.lastconv.weight', 'block_tea.lastconv.bias', 'contextnet.conv1.conv1.0.weight', 'contextnet.conv1.conv1.0.bias', 'contextnet.conv1.conv1.1.weight', 'contextnet.conv1.conv2.0.weight', 'contextnet.conv1.conv2.0.bias', 'contextnet.conv1.conv2.1.weight', 'contextnet.conv2.conv1.0.weight', 'contextnet.conv2.conv1.0.bias', 'contextnet.conv2.conv1.1.weight', 'contextnet.conv2.conv2.0.weight', 'contextnet.conv2.conv2.0.bias', 'contextnet.conv2.conv2.1.weight', 'contextnet.conv3.conv1.0.weight', 'contextnet.conv3.conv1.0.bias', 'contextnet.conv3.conv1.1.weight', 'contextnet.conv3.conv2.0.weight', 'contextnet.conv3.conv2.0.bias', 'contextnet.conv3.conv2.1.weight', 'contextnet.conv4.conv1.0.weight', 'contextnet.conv4.conv1.0.bias', 'contextnet.conv4.conv1.1.weight', 'contextnet.conv4.conv2.0.weight', 'contextnet.conv4.conv2.0.bias', 'contextnet.conv4.conv2.1.weight', 'unet.down0.conv1.0.weight', 'unet.down0.conv1.0.bias', 'unet.down0.conv1.1.weight', 'unet.down0.conv2.0.weight', 'unet.down0.conv2.0.bias', 'unet.down0.conv2.1.weight', 'unet.down1.conv1.0.weight', 'unet.down1.conv1.0.bias', 'unet.down1.conv1.1.weight', 'unet.down1.conv2.0.weight', 'unet.down1.conv2.0.bias', 'unet.down1.conv2.1.weight', 'unet.down2.conv1.0.weight', 'unet.down2.conv1.0.bias', 'unet.down2.conv1.1.weight', 'unet.down2.conv2.0.weight', 'unet.down2.conv2.0.bias', 'unet.down2.conv2.1.weight', 'unet.down3.conv1.0.weight', 'unet.down3.conv1.0.bias', 'unet.down3.conv1.1.weight', 'unet.down3.conv2.0.weight', 'unet.down3.conv2.0.bias', 'unet.down3.conv2.1.weight', 'unet.up0.0.weight', 'unet.up0.0.bias', 'unet.up0.1.weight', 'unet.up1.0.weight', 'unet.up1.0.bias', 'unet.up1.1.weight', 'unet.up2.0.weight', 'unet.up2.0.bias', 'unet.up2.1.weight', 'unet.up3.0.weight', 'unet.up3.0.bias', 'unet.up3.1.weight', 'unet.conv.weight', 'unet.conv.bias'])
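
For reference, a quick way to diff these against what the model expects (a sketch; model is assumed to be the Model wrapper instance from the snippet above, with the IFNet stored in model.flownet):

ckpt_keys = set(torch.load('train_log/flownet.pkl', map_location='cpu').keys())
model_keys = set(model.flownet.state_dict().keys())
print('missing from checkpoint:', sorted(model_keys - ckpt_keys))    # expected by the model but absent from the file
print('unexpected in checkpoint:', sorted(ckpt_keys - model_keys))   # stored in the file but not used by the model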

@hzwer
Owner

hzwer commented Dec 16, 2024

I don't understand; the model parameters look consistent with the keys in the pkl. "#How to fix PyTorch model-loading errors#", shared from Yuewen: https://yuewen.cn/share/179863116760768512?utm_source=share&utm_content=web_linkcopy&version=2
A very strange problem.

@YANG-success-last
Author

OK, thank you very much. The problem was that the model parameters were not being loaded into the inference code correctly; it is resolved now.
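
For anyone hitting the same error, a minimal sketch of the usual loading sequence (assuming the Model wrapper from model.RIFE as in the repository's inference scripts, and that it exposes eval() and device() helpers):

from model.RIFE import Model

model = Model()
model.load_model('train_log', -1)  # directory containing flownet.pkl
model.eval()                       # switch the underlying networks to eval mode
model.device()                     # move the networks to the available device (GPU if present)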
