The device supports: i8sdot:0, fp16:0, i8mm: 0, sve2: 0
Can't open file:.tempcache
Load Cache file error.
### is_single_ = 1
load tokenizer
tokenizer_type = 3
load tokenizer Done
load D:\llama2\model\llm.mnn ... Don't support type [Attention], Attention/Reshape_8_output_0
Don't support type [Attention], Attention/Reshape_8_output_0
Don't support type [Attention], Attention/Reshape_17_output_0
Don't support type [Attention], Attention/Reshape_17_output_0
Don't support type [Attention], Attention/Reshape_26_output_0
Don't support type [Attention], Attention/Reshape_26_output_0
Don't support type [Attention], Attention/Reshape_35_output_0
Don't support type [Attention], Attention/Reshape_35_output_0
Don't support type [Attention], Attention/Reshape_44_output_0
Don't support type [Attention], Attention/Reshape_44_output_0
Don't support type [Attention], Attention/Reshape_53_output_0
Don't support type [Attention], Attention/Reshape_53_output_0
Don't support type [Attention], Attention/Reshape_62_output_0
Don't support type [Attention], Attention/Reshape_62_output_0
Don't support type [Attention], Attention/Reshape_71_output_0
Don't support type [Attention], Attention/Reshape_71_output_0
Don't support type [Attention], Attention/Reshape_80_output_0
Don't support type [Attention], Attention/Reshape_80_output_0
Don't support type [Attention], Attention/Reshape_89_output_0
Don't support type [Attention], Attention/Reshape_89_output_0
Don't support type [Attention], Attention/Reshape_98_output_0
Don't support type [Attention], Attention/Reshape_98_output_0
Don't support type [Attention], Attention/Reshape_107_output_0
Don't support type [Attention], Attention/Reshape_107_output_0
Don't support type [Attention], Attention/Reshape_116_output_0
Don't support type [Attention], Attention/Reshape_116_output_0
Don't support type [Attention], Attention/Reshape_125_output_0
Don't support type [Attention], Attention/Reshape_125_output_0
Don't support type [Attention], Attention/Reshape_134_output_0
Don't support type [Attention], Attention/Reshape_134_output_0
Don't support type [Attention], Attention/Reshape_143_output_0
Don't support type [Attention], Attention/Reshape_143_output_0
Don't support type [Attention], Attention/Reshape_152_output_0
Don't support type [Attention], Attention/Reshape_152_output_0
Don't support type [Attention], Attention/Reshape_161_output_0
Don't support type [Attention], Attention/Reshape_161_output_0
Don't support type [Attention], Attention/Reshape_170_output_0
Don't support type [Attention], Attention/Reshape_170_output_0
Don't support type [Attention], Attention/Reshape_179_output_0
Don't support type [Attention], Attention/Reshape_179_output_0
Don't support type [Attention], Attention/Reshape_188_output_0
Don't support type [Attention], Attention/Reshape_188_output_0
Don't support type [Attention], Attention/Reshape_197_output_0
Don't support type [Attention], Attention/Reshape_197_output_0
Don't support type [Attention], Attention/Reshape_206_output_0
Don't support type [Attention], Attention/Reshape_206_output_0
Don't support type [Attention], Attention/Reshape_215_output_0
Don't support type [Attention], Attention/Reshape_215_output_0
Don't support type [Attention], Attention/Reshape_224_output_0
Don't support type [Attention], Attention/Reshape_224_output_0
Don't support type [Attention], Attention/Reshape_233_output_0
Don't support type [Attention], Attention/Reshape_233_output_0
Don't support type [Attention], Attention/Reshape_242_output_0
Don't support type [Attention], Attention/Reshape_242_output_0
Don't support type [Attention], Attention/Reshape_251_output_0
Don't support type [Attention], Attention/Reshape_251_output_0
Load Module Done!
Clone Decode Module Done!
On Windows, I am running qwen 1.5b inference with MNN 2.9.6. When the vulkan backend is selected, the errors above are reported; with the cpu backend everything works fine.
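Since the unsupported [Attention] op only appears with the vulkan backend, a common workaround until that backend supports the fused Attention op is to select the backend in the LLM runtime config shipped next to the model. A minimal sketch of such a config.json — field names like `backend_type` and `thread_num` are assumptions based on typical MNN LLM configs and may differ in your MNN version:

```json
{
    "llm_model": "llm.mnn",
    "backend_type": "cpu",
    "thread_num": 4
}
```

Setting `backend_type` back to "vulkan" should reproduce the warnings in the log above, since the vulkan backend in this MNN build does not implement the Attention op and execution falls back or fails.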