[Bug] The GitHub o1 model does not support streaming; please adapt it to non-streaming output #5145
Comments
👀 @Zzz3626 Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
I found a similar discussion about models that do not support streaming in the lobe-chat framework. The discussion is titled "调用chat的AI接口,传入stream为false,还是调用流式数据,或者返回报错" ("Calling the chat AI endpoint with stream set to false still returns streamed data, or returns an error"), and it remains unsolved [1]. Regarding your issue, the error message indicates that the model you are using does not support streaming. To resolve this, you can set `stream` to `false` in the request.
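The suggestion above amounts to sending the chat completion request with streaming disabled. A minimal sketch of building such a payload, assuming an OpenAI-compatible request shape (the helper name is illustrative, not lobe-chat's actual code):

```python
def build_chat_payload(model: str, messages: list, stream: bool = True) -> dict:
    """Build an OpenAI-compatible chat completion payload.

    Models like o1 reject stream=True, so callers pass stream=False
    for them. Illustrative helper, not lobe-chat's implementation.
    """
    return {
        "model": model,
        "messages": messages,
        "stream": stream,
    }

payload = build_chat_payload(
    "o1",
    [{"role": "user", "content": "Hello"}],
    stream=False,
)
```

Sending this payload avoids the `unsupported_value` error shown in this issue, at the cost of receiving the whole response at once instead of token by token.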
#5110 seems to have adapted this already, but you still need to manually add the o1 model as a custom model (as shown below)
I can reproduce this error as well.
📦 Deployment environment
Docker
📌 Software version
v1.36.46
💻 System environment
Windows
🌐 Browser
Edge
🐛 Problem description
{
"error": {
"code": "unsupported_value",
"type": "invalid_request_error",
"param": "stream",
"message": "Unsupported value: 'stream' does not support true with this model. Supported values are: false."
},
"endpoint": "https://models.inference.ai.azure.com/",
"provider": "github"
}
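A common way providers adapt models like this is to issue a non-streaming request under the hood and then re-emit the full response in small pieces, so downstream consumers written against a streaming API keep working. A hedged sketch of that idea; the model list and helper names here are assumptions for illustration, not lobe-chat's actual adapter:

```python
from typing import Iterator

# Assumption: models known to reject stream=True (illustrative list).
NON_STREAMING_MODELS = {"o1", "o1-mini"}


def needs_fake_stream(model: str) -> bool:
    """Return True when the model must be called without streaming."""
    return model in NON_STREAMING_MODELS


def fake_stream(full_text: str, chunk_size: int = 8) -> Iterator[str]:
    """Yield a complete (non-streaming) response in small pieces so
    that streaming consumers can process it unchanged."""
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i:i + chunk_size]


# Joining the emitted chunks reproduces the original response text.
chunks = list(fake_stream("Hello from o1", chunk_size=5))
assert "".join(chunks) == "Hello from o1"
```

This keeps the client-facing interface uniform: the UI always consumes chunks, and only the provider layer decides whether those chunks came from a real stream or a simulated one.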
📷 Reproduction steps
No response
🚦 Expected result
No response
📝 Additional information
No response