
lora key not loaded #6381

Open
llm8047 opened this issue Jan 7, 2025 · 25 comments
Labels
Potential Bug: User is reporting a bug. This should be tested.

Comments


llm8047 commented Jan 7, 2025

Expected Behavior

Requested to load SD1ClipModel
loaded completely 9.5367431640625e+25 235.84423828125 True
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_0_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_10_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_11_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_1_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_2_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_3_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_4_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_5_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_6_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_7_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_8_self_attn_v_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_mlp_fc2.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_k_proj.weight
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.on_input
lora key not loaded: lora_te_text_model_encoder_layers_9_self_attn_v_proj.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.weight
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.on_input
lora key not loaded: lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.weight
Requested to load SD1ClipModel

Actual Behavior

no

Steps to Reproduce

no

Debug Logs

no

Other

No response

llm8047 added the Potential Bug label Jan 7, 2025

LukeG89 commented Jan 7, 2025

Not a bug, you just used a LoRA that is incompatible with the model
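One way to see why a loader would skip exactly these keys: standard kohya-style LoRA weights use the suffixes `.lora_up.weight`, `.lora_down.weight`, and `.alpha`, while the keys in the log end in `.weight` and `.on_input`, a pair that looks like the LyCORIS (IA)^3 format. The following is a minimal sketch of that suffix check, not ComfyUI's actual loader code; the key names are taken from the log above.

```python
# Sketch: classify state-dict key suffixes the way a plain LoRA mapper
# would. Keys ending in .weight / .on_input (no .lora_up / .lora_down)
# do not match the standard suffix set, so they get reported as unloaded.

STANDARD_LORA_SUFFIXES = (".lora_up.weight", ".lora_down.weight", ".alpha")

def is_standard_lora_key(key: str) -> bool:
    """True if `key` uses the standard kohya LoRA suffixes."""
    return key.endswith(STANDARD_LORA_SUFFIXES)

for key in [
    "lora_te_text_model_encoder_layers_0_mlp_fc2.lora_up.weight",  # standard
    "lora_te_text_model_encoder_layers_0_mlp_fc2.on_input",        # from the log
    "lora_te_text_model_encoder_layers_0_mlp_fc2.weight",          # from the log
]:
    status = "ok" if is_standard_lora_key(key) else "would not load"
    print(f"{key}: {status}")
```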


llm8047 commented Jan 7, 2025

My LoRA was trained with Kohya_ss and runs fine in web_ui.


llm8047 commented Jan 7, 2025

Not a bug, you just used a LoRA that is incompatible with the model

My LoRA was trained with Kohya_ss and runs fine in web_ui.


LukeG89 commented Jan 7, 2025

My LoRA was trained with Kohya_ss and runs fine in web_ui.

And are you using the LoRA with a compatible model? For example, if the LoRA is for SD1.5, are you using an SD1.5 checkpoint?
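To check what is actually inside the LoRA file rather than guessing, you can dump the tensor names from the `.safetensors` header with the standard library alone. A hedged sketch follows; `my_lora.safetensors` is a placeholder path, not a file from this thread.

```python
import json
import struct

def safetensors_keys(path):
    """Return tensor names from a .safetensors file's JSON header.

    The format begins with an 8-byte little-endian header length,
    followed by that many bytes of JSON describing each tensor.
    """
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    # "__metadata__" is an optional string-to-string map, not a tensor.
    return sorted(k for k in header if k != "__metadata__")

# Example usage (hypothetical path):
# for key in safetensors_keys("my_lora.safetensors"):
#     print(key)
```

If the dumped names end in `.lora_up.weight` / `.lora_down.weight` the file is a standard LoRA; other suffixes point to a different training format.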


llm8047 commented Jan 8, 2025 via email


llm8047 commented Jan 8, 2025

My LoRA was trained with Kohya_ss and runs fine in web_ui.

And are you using the LoRA with a compatible model? For example, if the LoRA is for SD1.5, are you using an SD1.5 checkpoint?

Yes, the problem still exists.


XMZovo commented Jan 9, 2025

Me too. Have you worked it out?


llm8047 commented Jan 9, 2025

Me too. Have you worked it out?

No, there is still a problem.


llm8047 commented Jan 9, 2025

Me too. Have you worked it out?

No, there is still a problem.

If you find a solution, please let me know.


XMZovo commented Jan 10, 2025

I updated all the plugins and then it worked for me. You can give it a try.


llm8047 commented Jan 10, 2025 via email


XMZovo commented Jan 10, 2025

I'm using the 叶秋 (Yeqiu) integrated package.


llm8047 commented Jan 10, 2025 via email


XMZovo commented Jan 10, 2025

It all comes with the integrated package; I just updated it through the launcher.


llm8047 commented Jan 11, 2025 via email


XMZovo commented Jan 11, 2025

Just reinstall; that solves 90% of problems. It's working normally for me now. Or you can try uninstalling first and then doing a fresh install, rather than updating on top of the existing one.


llm8047 commented Jan 11, 2025 via email


llm8047 commented Jan 11, 2025 via email


XMZovo commented Jan 11, 2025

Everything I have is the latest version.


llm8047 commented Jan 11, 2025 via email


XMZovo commented Jan 11, 2025

The integrated package is 1.4.
I'm not sure about your setup.


llm8047 commented Jan 11, 2025 via email


XMZovo commented Jan 11, 2025

I can't see it.


llm8047 commented Jan 11, 2025 via email


XMZovo commented Jan 11, 2025

No.


3 participants