lora key not loaded #6381
Comments
Not a bug; you just used a LoRA that is incompatible with the model.
My LoRA is trained using Kohya_ss and runs fine in the webui.
And are you using the LoRA with a compatible model? For example, the LoRA is for SD1.5 and you are using an SD1.5 checkpoint?
Yes, my LoRA is for SD1.5 and I am using it with an SD1.5 checkpoint. The problem still exists.
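To check compatibility concretely, it can help to look at what key names the LoRA file actually contains before the loader tries to map them. Below is a minimal stdlib-only sketch that reads just the JSON header of a `.safetensors` file and lists its tensor keys; the file name `fake_lora.safetensors` and the tiny file written here are purely illustrative, not part of the original issue.

```python
import json
import struct

def list_safetensors_keys(path):
    """Return the tensor key names stored in a .safetensors file.

    The safetensors format begins with an 8-byte little-endian unsigned
    integer N, followed by N bytes of JSON metadata whose top-level keys
    are the tensor names (plus an optional "__metadata__" entry).
    """
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return sorted(k for k in header if k != "__metadata__")

# Illustration only: write a tiny fake file containing two key names
# taken from the log in this issue, then read them back.
fake_header = json.dumps({
    "lora_te_text_model_encoder_layers_0_mlp_fc2.weight": {},
    "lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.weight": {},
}).encode()
with open("fake_lora.safetensors", "wb") as f:
    f.write(struct.pack("<Q", len(fake_header)))
    f.write(fake_header)

print(list_safetensors_keys("fake_lora.safetensors"))
```

Running this against the real LoRA file (instead of the fake one) shows exactly which key names the trainer produced, which can then be compared against what the loader reports as "not loaded".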
Yes, the problem still exists.
Me too. Have you worked it out?
No, there is still a problem.
If you find a solution, please let me know.
Update all the plugins; after that it worked for me. You can give it a try.
Mine should be the latest version, v0.3.10. This version doesn't seem to have anywhere in the UI to update plugins, does it? Is yours the same version?
I'm using the Yeqiu (叶秋) all-in-one package.
Do you mean the Qiuye (秋叶) ComfyUI package? I remember that one bundles the webui, doesn't it? Did you update the plugins from the UI?
There are all-in-one packages for both; I just updated from the launcher.
After updating, my problem still persists and I just can't solve it. It's really strange.
Try reinstalling; that solves 90% of problems. It's working normally for me now. Or you could try uninstalling first and installing fresh, rather than updating on top of the old install.
I reinstalled the latest version and still have the original problem; I'm at a loss. I was originally using the webui, where there were no issues, but as soon as I migrated over, everything broke.
Which version of ComfyUI did you install?
Everything I have is the latest version.
Is it a version like v0.3.10?
The all-in-one package is 1.4.
This is the interface I see; is yours the same?
I can't see it.
Try downloading the attachment this time.
No.
Expected Behavior
Requested to load SD1ClipModel
loaded completely 9.5367431640625e+25 235.84423828125 True
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.weight
Requested to load SD1ClipModel
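One observable pattern in the log above: every failing key ends in `.weight` or `.on_input`, rather than the `lora_up.weight` / `lora_down.weight` / `alpha` triplet that a standard Kohya-style LoRA uses per module, which suggests the file may be in a different weight layout than the loader expects. A small sketch to group the failing key names (the sample list below is copied from the log; the grouping logic is illustrative, not part of ComfyUI):

```python
from collections import Counter

# A few of the "lora key not loaded" names from the log above.
failed_keys = [
    "lora_te_text_model_encoder_layers_0_mlp_fc2.on_input",
    "lora_te_text_model_encoder_layers_0_mlp_fc2.weight",
    "lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.weight",
    "lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.on_input",
]

def classify(key):
    """Split a LoRA key into (target model, trailing suffix)."""
    target = "text_encoder" if key.startswith("lora_te_") else "unet"
    suffix = key.rsplit(".", 1)[1]  # e.g. "weight" or "on_input"
    return target, suffix

counts = Counter(classify(k) for k in failed_keys)
print(counts)
```

Run over the full log, a tally like this makes it easy to see whether the failures are confined to one suffix or one part of the model, which is useful information to include in a bug report.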
Actual Behavior
no
Steps to Reproduce
no
Debug Logs
Other
No response