Feature/Enhancement Add or Help to add Open Source LLM support #177

Open
JPC612 opened this issue Aug 7, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@JPC612

JPC612 commented Aug 7, 2024

It would be great to add support for an open-source LLM, or to get help integrating one, for example with LlamaIndex and Ollama. I could potentially do it myself, but I'm not yet sure whether I can replace the Marvin OpenAI dependency with something like LlamaIndex's structured output parser. I also don't know whether LlamaIndex can fully replicate Marvin's capabilities for this project. Could you assist with this?

Thanks in advance

@JSv4
Owner

JSv4 commented Aug 8, 2024

Sure, I'd be happy to help. Marvin is exceptionally good at structured extraction, but LangChain and LlamaIndex both have structured output parsers that should work.
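For reference, here's a minimal sketch of how a LlamaIndex Pydantic-based completion program could stand in for the kind of extract/cast call Marvin handles today. This isn't OpenContracts' actual extraction code; the output class, prompt, and field names are placeholders.

```python
from pydantic import BaseModel

from llama_index.core.program import LLMTextCompletionProgram


class PartyInfo(BaseModel):
    """Hypothetical extraction target, standing in for whatever Marvin returns today."""
    name: str
    role: str


# Uses whatever LLM is configured in Settings (OpenAI by default) and a
# Pydantic output parser under the hood to coerce the completion into PartyInfo.
program = LLMTextCompletionProgram.from_defaults(
    output_cls=PartyInfo,
    prompt_template_str=(
        "Extract the contracting party mentioned in the following text:\n{text}"
    ),
)

result = program(text="This agreement is between Acme Corp (the Seller) and ...")
print(result.name, result.role)
```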

Switching to an open-source LLM is pretty easy in LlamaIndex. The bigger challenge will be where to host it. We could host it in a container in the compose stack, but it'll likely have huge memory requirements. A better bet is to use HF's inference endpoints. You can also see some work another user did on integrating Ollama here.
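To give a sense of what the swap itself would look like, here's a rough sketch. The model tag and timeout are assumptions, and it presumes the llama-index-llms-ollama package is installed and an Ollama server is reachable on its default local port.

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# Point LlamaIndex at a locally running Ollama server instead of OpenAI.
# "llama3" is just a placeholder model tag; base_url defaults to localhost:11434.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)

# The structured program sketched above would then pick this LLM up via Settings,
# or it can be passed explicitly with llm=Settings.llm.
```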

@JSv4 JSv4 added the enhancement New feature or request label Aug 8, 2024