
[Bug]: Error while use deepseek R1 model #6443

Closed
1 task done
Lark-Base opened this issue Jan 24, 2025 · 4 comments
Labels
bug Something isn't working

Comments

@Lark-Base
Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

```
BadRequestError: litellm.BadRequestError: DeepseekException - Error code: 400 - {'error': {'message': 'deepseek-reasoner does not support successive user or assistant messages (messages[1] and messages[2] in your input). You should interleave the user/assistant messages in the message sequence.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
```
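The constraint behind this 400 can be shown with a plain list of chat messages: deepseek-reasoner rejects any sequence in which two user (or two assistant) messages are adjacent. A minimal illustration (the message contents are made up; `alternates` is a hypothetical helper, not part of litellm):

```python
# Hypothetical illustration of the message shape deepseek-reasoner rejects.
# After the optional system message, user/assistant roles must strictly alternate.

rejected = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Set up the task."},       # messages[1]
    {"role": "user", "content": "Extra context prompt."},  # messages[2] -> 400
    {"role": "assistant", "content": "Working on it."},
]

accepted = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Set up the task.\n\nExtra context prompt."},
    {"role": "assistant", "content": "Working on it."},
]

def alternates(messages):
    """Return True if no two consecutive messages share a role."""
    roles = [m["role"] for m in messages]
    return all(a != b for a, b in zip(roles, roles[1:]))

print(alternates(rejected))  # False
print(alternates(accepted))  # True
```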

OpenHands Installation

Docker command in README

OpenHands Version

0.21

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

@Lark-Base (Author)

It works now; add `AGENT_ENABLE_PROMPT_EXTENSIONS=0`:

```shell
docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.21-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -e AGENT_ENABLE_PROMPT_EXTENSIONS=0 \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands-state:/.openhands-state \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.21
```

@enyst (Collaborator) commented Jan 24, 2025

Thank you for the report and follow-up! I think it's something that litellm might fix; they already merge messages for some other LLMs.

Were you using the official API? @Lark-Base

@gdias1992

> Thank you for the report and follow-up! I think it's something that litellm might fix; they already merge messages for some other LLMs.
>
> Were you using the official API? @Lark-Base

I am using the official API and got the same problem.

@j0yk1ll commented Jan 26, 2025

@gdias1992 The issue is that deepseek returns an additional reasoning message, i.e. user -> reasoning -> ai -> user -> reasoning -> ai (it could also be just an additional field on the AI message, but the theory is the same).

The underlying libraries don't support that yet; they still expect user -> ai -> user -> ai.

As a result it raises an error. That's why @Lark-Base suggested `AGENT_ENABLE_PROMPT_EXTENSIONS=0`.
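A client-side workaround for the alternation requirement is to collapse runs of consecutive same-role messages before calling the API. This is a sketch of that idea, not litellm's actual implementation; `merge_consecutive` is a hypothetical helper:

```python
def merge_consecutive(messages, sep="\n\n"):
    """Collapse runs of consecutive same-role messages into one message,
    so the sequence alternates as deepseek-reasoner requires."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role as the previous message: join the contents.
            merged[-1] = {
                "role": msg["role"],
                "content": merged[-1]["content"] + sep + msg["content"],
            }
        else:
            merged.append(dict(msg))
    return merged

# Example history that would trigger the 400 (two adjacent user messages):
history = [
    {"role": "user", "content": "Task description."},
    {"role": "user", "content": "Prompt-extension context."},
    {"role": "assistant", "content": "Plan."},
]
print([m["role"] for m in merge_consecutive(history)])  # ['user', 'assistant']
```

Disabling prompt extensions avoids the problem at the source by not emitting the extra user message in the first place; merging is what a client library could do transparently instead.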
