Better support for several different models #161

Open
vslavkin opened this issue Oct 8, 2024 · 0 comments

vslavkin commented Oct 8, 2024

Hello, sorry if the title isn't the best; I couldn't think of anything better.
The problem is that I'm trying out several models, and each one works best with a different prompt, especially models made for one specific task, like *-code models. Therefore, the templates should be much more versatile and customizable.

I was thinking it would be great if, when defining a new provider, one could also set its templates. Something like:

(setopt ellama-providers
        `(("zephyr"
           :provider ,(make-llm-ollama
                       :chat-model "zephyr:7b-beta-q6_K"
                       :embedding-model "zephyr:7b-beta-q6_K")
           ...
           :templates
           ((code-completion . "Complete the following code in a single markdown block: %s")
            (etc...)))))
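
Ellama could then pick the action-specific template when one exists and fall back to the current defaults otherwise. Just to illustrate what I mean (the helper name and the entry shape above are made up, not existing ellama code):

(defun my/ellama-template (provider-entry action default)
  "Return the template for ACTION from PROVIDER-ENTRY, or DEFAULT if none is set."
  ;; PROVIDER-ENTRY is assumed to look like ("zephyr" :provider ... :templates ...).
  (or (alist-get action (plist-get (cdr provider-entry) :templates))
      default))

;; e.g. (my/ellama-template (assoc "zephyr" ellama-providers)
;;                          'code-completion
;;                          "Complete the following code: %s")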

This, plus what was discussed in #110, plus maybe a system to detect whether a model is installed or not (something like the sketch below), would make ellama much easier to set up (the detection part should probably be its own issue).
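
For the detection part, something as simple as parsing the output of `ollama list` might be enough. A rough sketch, assuming the ollama CLI is on PATH (the function name is made up):

(defun my/ollama-model-installed-p (model)
  "Return non-nil if MODEL appears in the output of `ollama list'."
  (let* ((output (shell-command-to-string "ollama list"))
         ;; Drop the header row, then take the first column (the model name).
         (names (mapcar (lambda (line) (car (split-string line)))
                        (cdr (split-string output "\n" t)))))
    (member model names)))

;; e.g. (my/ollama-model-installed-p "zephyr:7b-beta-q6_K")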
I can contribute, but I'm still in the process of getting the FSF copyright assignment (and I'm not the best elisper out there).
