
[Feature] remove temperature param in case of reasoning models like o1 and o3 #914

Open
carlosveloso opened this issue Feb 3, 2025 · 0 comments
Labels
enhancement New feature or request triage

Comments

@carlosveloso

What Would You Like to See with the Gateway?

Some libraries, like the Vercel AI SDK, send a default `temperature: 0` value, which breaks calls to OpenAI's reasoning models. Currently there is no way to remove it. Is it possible for the gateway to ignore or strip that parameter?
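A minimal sketch of the requested behavior, as a request-transform step the gateway could apply before forwarding upstream. The type, function, and model-prefix list below are illustrative assumptions, not the gateway's actual API:

```typescript
// Hypothetical request shape; real gateways carry more fields.
type ChatRequest = {
  model: string;
  temperature?: number;
  messages: { role: string; content: string }[];
};

// Assumed prefixes for OpenAI reasoning models (o1, o1-mini, o3-mini, ...).
const REASONING_MODEL_PREFIXES = ["o1", "o3"];

function stripUnsupportedParams(req: ChatRequest): ChatRequest {
  const isReasoning = REASONING_MODEL_PREFIXES.some(
    (p) => req.model === p || req.model.startsWith(`${p}-`)
  );
  if (!isReasoning) return req;
  // Reasoning models reject an explicit temperature, so drop the
  // field entirely rather than forwarding the SDK's default of 0.
  const { temperature, ...rest } = req;
  return rest;
}
```

With this in place, a request for `o1-mini` would lose its `temperature` field, while a request for a non-reasoning model like `gpt-4o` would pass through unchanged.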

Context for your Request

No response

Your Twitter/LinkedIn

No response

@carlosveloso carlosveloso added the enhancement New feature or request label Feb 3, 2025
@github-actions github-actions bot added the triage label Feb 3, 2025