+1
Our setup is OpenWebUI -> LiteLLM -> Ollama, and it is close to unusable with DeepSeek-R1. OpenWebUI -> Ollama works perfectly fine, including live streaming of reasoning content.
What happened?
DeepSeek `reasoning_content` in the stream response always returns `None`. Here is the example code.
Without streaming, I can see `reasoning_content` working fine!

Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.59.5
Twitter / LinkedIn details
@MervinPraison