
Running qwen1.5b inference with the vulkan backend fails with "Don't support type [Attention], Attention/Reshape_8_output_0" #224

Open
RisingEntropy opened this issue Oct 24, 2024 · 0 comments

I am running qwen 1.5b inference with MNN 2.9.6 on Windows. When the backend is set to vulkan, the error below is reported, while everything works fine with the cpu backend (see the backend-selection sketch after the log):

The device supports: i8sdot:0, fp16:0, i8mm: 0, sve2: 0
Can't open file:.tempcache
Load Cache file error.
### is_single_ = 1
load tokenizer
tokenizer_type = 3
load tokenizer Done
load D:\llama2\model\llm.mnn ... Don't support type [Attention], Attention/Reshape_8_output_0
Don't support type [Attention], Attention/Reshape_8_output_0
Don't support type [Attention], Attention/Reshape_17_output_0
Don't support type [Attention], Attention/Reshape_17_output_0
Don't support type [Attention], Attention/Reshape_26_output_0
Don't support type [Attention], Attention/Reshape_26_output_0
Don't support type [Attention], Attention/Reshape_35_output_0
Don't support type [Attention], Attention/Reshape_35_output_0
Don't support type [Attention], Attention/Reshape_44_output_0
Don't support type [Attention], Attention/Reshape_44_output_0
Don't support type [Attention], Attention/Reshape_53_output_0
Don't support type [Attention], Attention/Reshape_53_output_0
Don't support type [Attention], Attention/Reshape_62_output_0
Don't support type [Attention], Attention/Reshape_62_output_0
Don't support type [Attention], Attention/Reshape_71_output_0
Don't support type [Attention], Attention/Reshape_71_output_0
Don't support type [Attention], Attention/Reshape_80_output_0
Don't support type [Attention], Attention/Reshape_80_output_0
Don't support type [Attention], Attention/Reshape_89_output_0
Don't support type [Attention], Attention/Reshape_89_output_0
Don't support type [Attention], Attention/Reshape_98_output_0
Don't support type [Attention], Attention/Reshape_98_output_0
Don't support type [Attention], Attention/Reshape_107_output_0
Don't support type [Attention], Attention/Reshape_107_output_0
Don't support type [Attention], Attention/Reshape_116_output_0
Don't support type [Attention], Attention/Reshape_116_output_0
Don't support type [Attention], Attention/Reshape_125_output_0
Don't support type [Attention], Attention/Reshape_125_output_0
Don't support type [Attention], Attention/Reshape_134_output_0
Don't support type [Attention], Attention/Reshape_134_output_0
Don't support type [Attention], Attention/Reshape_143_output_0
Don't support type [Attention], Attention/Reshape_143_output_0
Don't support type [Attention], Attention/Reshape_152_output_0
Don't support type [Attention], Attention/Reshape_152_output_0
Don't support type [Attention], Attention/Reshape_161_output_0
Don't support type [Attention], Attention/Reshape_161_output_0
Don't support type [Attention], Attention/Reshape_170_output_0
Don't support type [Attention], Attention/Reshape_170_output_0
Don't support type [Attention], Attention/Reshape_179_output_0
Don't support type [Attention], Attention/Reshape_179_output_0
Don't support type [Attention], Attention/Reshape_188_output_0
Don't support type [Attention], Attention/Reshape_188_output_0
Don't support type [Attention], Attention/Reshape_197_output_0
Don't support type [Attention], Attention/Reshape_197_output_0
Don't support type [Attention], Attention/Reshape_206_output_0
Don't support type [Attention], Attention/Reshape_206_output_0
Don't support type [Attention], Attention/Reshape_215_output_0
Don't support type [Attention], Attention/Reshape_215_output_0
Don't support type [Attention], Attention/Reshape_224_output_0
Don't support type [Attention], Attention/Reshape_224_output_0
Don't support type [Attention], Attention/Reshape_233_output_0
Don't support type [Attention], Attention/Reshape_233_output_0
Don't support type [Attention], Attention/Reshape_242_output_0
Don't support type [Attention], Attention/Reshape_242_output_0
Don't support type [Attention], Attention/Reshape_251_output_0
Don't support type [Attention], Attention/Reshape_251_output_0
Load Module Done!
Clone Decode Module Done!
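
For context, the repeated "Don't support type [Attention]" lines indicate that the Vulkan backend reports no implementation for the fused Attention operator used by the converted LLM graph. Below is a minimal sketch of how a forward type and a CPU backup type are usually chosen with MNN's generic C++ Interpreter API; this is not the llm_demo code path (which is driven by its own config file), and the model path and thread count here are placeholders, not values taken from this report.

```cpp
#include <MNN/Interpreter.hpp>
#include <memory>

int main() {
    // Placeholder path; in the report the model lives at D:\llama2\model\llm.mnn.
    std::shared_ptr<MNN::Interpreter> net(
        MNN::Interpreter::createFromFile("llm.mnn"), MNN::Interpreter::destroy);
    if (!net) {
        return -1;
    }

    MNN::ScheduleConfig config;
    config.type       = MNN_FORWARD_VULKAN; // requested backend (vulkan in this report)
    config.backupType = MNN_FORWARD_CPU;    // backend used for ops the requested one lacks
    config.numThread  = 4;                  // placeholder thread count

    auto session = net->createSession(config);
    // ... fill the input tensors, then run ...
    net->runSession(session);
    return 0;
}
```

With this API, operators missing from the requested backend are normally expected to fall back to the backup backend rather than abort; whether the LLM runtime takes the same fallback path for the Attention op is exactly what this issue is asking about.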