It would be great to add support for an open-source LLM, or to get help integrating one, for example with LlamaIndex and Ollama. I could potentially do it myself, but I'm not yet sure whether I can replace the Marvin/OpenAI dependency with something like LlamaIndex's structured output parser, and I don't know if LlamaIndex can fully replicate Marvin's capabilities for this project. Could you assist with this?
Thanks in advance
Sure, I'd be happy to help. Marvin is exceptionally good at structured extraction, but both LangChain and LlamaIndex have structured output parsers that should work.
Switching to an open-source LLM is pretty easy in LlamaIndex. The bigger challenge will be where to host it. We could host it in a container in the compose stack, but it'll likely have huge memory requirements. A better bet is to use Hugging Face's Inference Endpoints. You can also see some work another user did on integrating Ollama here.
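For reference, here's a rough sketch of what the LlamaIndex + Ollama path could look like for structured extraction. It assumes the llama-index-core and llama-index-llms-ollama packages and an Ollama server already running locally; the Bookmark schema and prompt are just placeholders for whatever Marvin extracts today, and exact import paths can differ between LlamaIndex versions.

```python
# Sketch only: structured extraction with LlamaIndex + a local Ollama model,
# as a possible stand-in for Marvin's extract. Assumes llama-index-core and
# llama-index-llms-ollama are installed and Ollama is running locally.
from pydantic import BaseModel
from llama_index.llms.ollama import Ollama
from llama_index.core.program import LLMTextCompletionProgram


class Bookmark(BaseModel):
    """Hypothetical output schema, standing in for the fields Marvin extracts."""
    title: str
    summary: str
    tags: list[str]


# Point at a locally served model, e.g. after `ollama run mistral`.
llm = Ollama(model="mistral", request_timeout=120.0)

# LLMTextCompletionProgram parses the completion into the Pydantic class above.
program = LLMTextCompletionProgram.from_defaults(
    output_cls=Bookmark,
    llm=llm,
    prompt_template_str=(
        "Extract a title, a one-sentence summary, and up to five tags "
        "from the following page text:\n\n{page_text}"
    ),
)

result = program(page_text="...raw page text here...")
print(result.title, result.tags)
```

Smaller open models are noticeably less reliable at emitting valid JSON than GPT-4-class models, so whichever parser we pick will probably need retries or validation around it.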