Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
0.15.2
Cloud or Self Hosted
Cloud
Steps to reproduce
In chatbot mode, the DeepSeek API created from Azure AI Foundry works perfectly. However, after converting the app to chatflow mode, an error occurs in the second newly added LLM block.
The error message is as follows:
Run failed: 1 validation error for LLMNodeData
model.mode
Field required [type=missing, input_value={'provider': 'azure_ai_st... 1, 'max_tokens': 50}}, input_type=dict]
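This is a Pydantic validation failure: the nested model config in the node data is missing the required mode key. A minimal sketch that reproduces the same class of error (the classes below are simplified stand-ins for Dify's actual LLMNodeData definition, and the provider/model names are placeholders, not values from the real DSL):

```python
from pydantic import BaseModel, ValidationError

# Simplified, hypothetical stand-ins for Dify's internal node-data models.
class ModelConfig(BaseModel):
    provider: str
    name: str
    mode: str                 # required -- the field the broken DSL omits
    completion_params: dict

class LLMNodeData(BaseModel):
    model: ModelConfig

try:
    # "mode" is missing from the dict, mirroring the auto-converted DSL.
    LLMNodeData(model={
        "provider": "example_provider",
        "name": "example-model",
        "completion_params": {"temperature": 1, "max_tokens": 50},
    })
except ValidationError as exc:
    print(exc)  # reports: 1 validation error for LLMNodeData, model.mode Field required
```

The shape of the printed error ("model.mode ... Field required [type=missing, ...]") matches the message in the report, which suggests the exported DSL's model section is not being parsed with a valid mode value.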
After exporting the DSL file, I noticed that in the automatically converted chatflow DSL, the LLM block contains:
mode: completion
However, this attribute does not exist in manually created LLM blocks (even those created from a blank template).
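For reference, the relevant fragment of the auto-converted chatflow DSL might look roughly like this (the provider and model names are illustrative placeholders reconstructed from the error message, not a verbatim export; the key point is the stray mode line):

```yaml
# Sketch of the LLM node's model section after auto-conversion to chatflow.
model:
  provider: example_provider        # placeholder
  name: example-deepseek-deployment # placeholder
  mode: completion                  # present after conversion; absent in manually created LLM blocks
  completion_params:
    temperature: 1
    max_tokens: 50
```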
✔️ Expected Behavior
No response
❌ Actual Behavior
No response
Hey @kelunyang! I'm here to help you with your bug report and any questions you have. I can also assist you in becoming a contributor.
The issue you're encountering seems to be related to the mode attribute in the LLM block within the chatflow DSL. The error message indicates that the model.mode field is missing, which is required for the LLMNodeData object. In chatflow mode, the mode attribute is crucial as it specifies the operational mode of the language model. When set to "chat," it enables tool calling by invoking assistant tools, whereas "completion" mode focuses on generating text completions without invoking tools [1].
To resolve this issue, ensure that the mode attribute is correctly set in your chatflow DSL. If the mode is set to "completion" and you need tool calling capabilities, consider changing it to "chat." Additionally, verify that all required fields for the LLMNodeData object are properly configured, as missing fields can lead to validation errors [2].
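As a workaround, one could edit the exported DSL so the LLM node's model section carries a valid mode before re-importing it. A sketch, assuming the structure implied by the error message (provider and model names are placeholders):

```yaml
model:
  provider: example_provider        # placeholder
  name: example-deepseek-deployment # placeholder
  mode: chat                        # "chat" enables tool calling; "completion" only generates text
  completion_params:
    temperature: 1
    max_tokens: 50
```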