I like setting temperature (see https://github.com/ahyatt/llm/blob/a090d3bdbd8764c2d0ea6e157ad02569068a09cb/llm.el#L101C1-L104C22 ) to 0.
AFAICT this is not currently cleanly possible with ellama.
Is there a simple elisp hack to hardcode the temperature to zero?
You can create a custom ollama model with temperature 0. See the Modelfile documentation in the ollama repo.
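For example, a minimal Modelfile along these lines should work (a sketch: the base model `llama3` and the derived name `llama3-temp0` are placeholders, substitute whatever model you actually run):

```
# Modelfile: derive a deterministic variant of an existing model
FROM llama3
PARAMETER temperature 0
```

Then register it and point ellama at the new name:

```
ollama create llama3-temp0 -f Modelfile
```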
I don't use the ollama backend; I use Groq or OpenRouter, depending on the model. (Both are OpenAI-compatible servers.)
No simple hacks; I will add support later. The llm library supports it, so I just need to pass the parameter through.
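Until then, one blunt workaround is to advise the llm entry points so every prompt goes out with temperature 0, regardless of backend. This is only a sketch: it assumes ellama funnels requests through the `llm-chat` / `llm-chat-async` / `llm-chat-streaming` generics (provider first, prompt second) and that the prompt is an `llm-chat-prompt` struct with a `temperature` slot, as in the llm.el lines linked above; `my/llm-force-zero-temperature` is a name invented for this example.

```elisp
(require 'llm)

(defun my/llm-force-zero-temperature (args)
  "Filter-args advice: force the prompt's temperature slot to 0.
ARGS is (PROVIDER PROMPT ...) as passed to the llm chat entry points."
  (let ((prompt (nth 1 args)))
    ;; Only touch real `llm-chat-prompt' structs; leave anything else alone.
    (when (llm-chat-prompt-p prompt)
      (setf (llm-chat-prompt-temperature prompt) 0)))
  args)

;; Apply to the synchronous, async, and streaming entry points alike.
(dolist (fn '(llm-chat llm-chat-async llm-chat-streaming))
  (advice-add fn :filter-args #'my/llm-force-zero-temperature))
```

Undo it with `(advice-remove fn #'my/llm-force-zero-temperature)` for each function once ellama grows a proper setting.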