Feature request: customizable temperature #99

Open · hab25 opened this issue Apr 3, 2024 · 4 comments
Labels: enhancement (New feature or request)
Comments

hab25 commented Apr 3, 2024

I like setting the temperature (see https://github.com/ahyatt/llm/blob/a090d3bdbd8764c2d0ea6e157ad02569068a09cb/llm.el#L101C1-L104C22) to 0.

AFAICT there is currently no clean way to do this with ellama.
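For context, the linked lines define the `temperature` slot of the llm library's `llm-chat-prompt` struct. A minimal sketch of setting it directly at the llm level (`my-provider` is a placeholder for any configured llm provider); ellama constructs these prompts itself, which is why there is no clean hook from ellama's side:

```elisp
;; Sketch: build a prompt with temperature 0 using the constructors
;; cl-defstruct generates for llm's documented structs.
(require 'llm)

(let ((prompt (make-llm-chat-prompt
               :interactions (list (make-llm-chat-prompt-interaction
                                    :role 'user
                                    :content "Hello"))
               :temperature 0)))  ; deterministic sampling
  (llm-chat my-provider prompt))
```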

s-kostyaev added the enhancement (New feature or request) label on Apr 3, 2024
NightMachinery commented

Is there a simple elisp hack to hardcode the temperature to zero?

s-kostyaev (Owner) commented

You can create a custom Ollama model with temperature 0. See the Modelfile documentation in the Ollama repo.
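For reference, a minimal sketch of that approach (base and derived model names are illustrative). First, a Modelfile that pins the parameter:

```
# Modelfile: a zero-temperature variant of an existing base model
FROM llama3
PARAMETER temperature 0
```

Build it with `ollama create llama3-zero -f Modelfile`, then point ellama at it:

```elisp
;; Illustrative: select the custom model through the llm-ollama
;; provider, using ellama's standard provider option.
(require 'llm-ollama)
(setopt ellama-provider
        (make-llm-ollama :chat-model "llama3-zero"))
```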

NightMachinery commented

I don't use the ollama backend; I use Groq or OpenRouter, depending on the model. (Both are OpenAI-compatible servers.)

s-kostyaev (Owner) commented

No simple hacks; I will add support later. The llm library supports it, so I just need to pass the parameter through.
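Until then, one interim workaround (an untested sketch, not ellama's API) is to advise llm's chat entry points and overwrite the prompt's temperature via the setf-able accessor that cl-defstruct generates for the slot linked above:

```elisp
;; Untested sketch: clamp temperature to 0 on every llm chat call.
(require 'llm)

(defun my/llm-zero-temperature (args)
  "Force the temperature of the prompt in ARGS (PROVIDER PROMPT ...) to 0."
  (let ((prompt (cadr args)))
    (when (llm-chat-prompt-p prompt)
      (setf (llm-chat-prompt-temperature prompt) 0)))
  args)

(dolist (fn '(llm-chat llm-chat-async llm-chat-streaming))
  (advice-add fn :filter-args #'my/llm-zero-temperature))
```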
