
[Bug]: DeepSeek reasoning_content stream response returns None always #7942

Open
MervinPraison opened this issue Jan 23, 2025 · 3 comments
Labels: bug (Something isn't working)

Comments

@MervinPraison

What happened?

When streaming, the DeepSeek reasoning_content field in the response is always None. Without streaming, I can see reasoning_content working fine.

Here is the example code:

from litellm import completion

resp = completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "What is 1+1?"}],
    stream=True
)

for chunk in resp:
    print(chunk)

Relevant log output

ModelResponseStream(id='xxxxx-36be-4fcf-xxxxx-4d36ecaf7e0b', created=2345235, model='deepseek-reasoner', object='chat.completion.chunk', system_fingerprint='fp_1c5d8833bc', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields={'reasoning_content': None}, refusal=None, content='The', role='assistant', function_call=None, tool_calls=None, audio=None), logprobs=None)], stream_options=None, citations=None)
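For clarity, here is a minimal sketch of how the field can be read from each chunk (assuming the Delta shape shown in the log above); it prints None for every chunk:

from litellm import completion

resp = completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "What is 1+1?"}],
    stream=True
)

for chunk in resp:
    # provider_specific_fields carries reasoning_content for DeepSeek,
    # but every streamed chunk reports it as None
    fields = chunk.choices[0].delta.provider_specific_fields or {}
    print(fields.get("reasoning_content"))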

Are you a ML Ops Team?

No

What LiteLLM version are you on?

v1.59.5

Twitter / LinkedIn details

@MervinPraison

MervinPraison added the bug (Something isn't working) label on Jan 23, 2025
@MervinPraison (Author)

Even the litellm.stream_chunk_builder function does not return any reasoning_content:

from litellm import completion
import litellm

messages = [{"role": "user", "content": "What is 1+1?"}]
resp = completion(
    model="deepseek/deepseek-reasoner",
    messages=messages,
    stream=True
)

chunks = []
for chunk in resp:
    chunks.append(chunk)

print(litellm.stream_chunk_builder(chunks, messages=messages))
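Manually accumulating the field across the collected chunks confirms there is nothing for the builder to aggregate (a quick sketch, assuming the Delta shape from the log above):

reasoning_parts = []
for chunk in chunks:
    fields = chunk.choices[0].delta.provider_specific_fields or {}
    rc = fields.get("reasoning_content")
    if rc is not None:
        reasoning_parts.append(rc)

# Nothing is ever collected, so there is no reasoning_content to rebuild
print("".join(reasoning_parts))  # -> empty string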

@maykcaldas

+1. When not streaming, provider_specific_fields is populated successfully, but with streaming it is not.
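For example (a minimal sketch; the exact accessor may differ between LiteLLM versions):

from litellm import completion

resp = completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "What is 1+1?"}]
)

msg = resp.choices[0].message
# Without streaming, provider_specific_fields carries the reasoning text
fields = msg.provider_specific_fields or {}
print(fields.get("reasoning_content"))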

@MarcoWel

+1
Our setup is OpenWebUI -> LiteLLM -> Ollama, and it is close to unusable with DeepSeek-R1.
OpenWebUI -> Ollama works perfectly fine, including live streaming of the reasoning content.
