[Bug]: Error while using deepseek R1 model #6443
Comments
It works now: add `AGENT_ENABLE_PROMPT_EXTENSIONS=0` to `docker run -it --rm --pull=always`
Thank you for the report and follow-up! I think it's something that litellm might fix; they are merging messages for some other LLMs too. Were you using the official API? @Lark-Base
I am using the official API and got the same problem.
@gdias1992 The issue is that deepseek returns an additional reasoning message, i.e. user -> reasoning -> ai -> user -> reasoning -> ai (it could also just be an additional field on the AI message, but the theory is the same). The underlying libraries don't support that yet; they still expect user -> ai -> user -> ai, so the request fails with an error. That's why @Lark-Base suggested `AGENT_ENABLE_PROMPT_EXTENSIONS=0`.
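The message merging mentioned above can be sketched roughly like this. This is not OpenHands or litellm code, just a hypothetical illustration of collapsing adjacent same-role messages so the sequence strictly alternates, which is what deepseek-reasoner requires:

```python
def merge_successive_messages(messages):
    """Merge adjacent messages that share a role into one message,
    joining their contents, so user/assistant turns strictly alternate."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role as the previous message: fold it in.
            merged[-1]["content"] += "\n" + msg["content"]
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged


# Two successive user messages, as produced by prompt extensions:
messages = [
    {"role": "user", "content": "Fix the bug."},
    {"role": "user", "content": "Here is the traceback."},
    {"role": "assistant", "content": "Looking into it."},
]
print(merge_successive_messages(messages))
```

After merging, the two user messages become one, and the sequence alternates user -> assistant as the API expects.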
Is there an existing issue for the same bug?
Describe the bug and reproduction steps
BadRequestError: litellm.BadRequestError: DeepseekException - Error code: 400 - {'error': {'message': 'deepseek-reasoner does not support successive user or assistant messages (messages[1] and messages[2] in your input). You should interleave the user/assistant messages in the message sequence.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
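The error above points at messages[1] and messages[2] sharing a role. A small (hypothetical, illustrative) checker makes it easy to spot which adjacent pairs in a message list would trigger this 400 before sending the request:

```python
def find_successive_roles(messages):
    """Return index pairs (i, i+1) where two adjacent messages share
    the same role -- the pattern deepseek-reasoner rejects with a 400."""
    return [
        (i, i + 1)
        for i in range(len(messages) - 1)
        if messages[i]["role"] == messages[i + 1]["role"]
    ]


messages = [
    {"role": "system", "content": "You are an agent."},
    {"role": "user", "content": "Fix the bug."},
    {"role": "user", "content": "Extra prompt-extension message."},
    {"role": "assistant", "content": "Working on it."},
]
print(find_successive_roles(messages))  # flags the (1, 2) pair
```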
OpenHands Installation
Docker command in README
OpenHands Version
0.21
Operating System
None
Logs, Errors, Screenshots, and Additional Context
No response