Replies: 1 comment
Hey @Exemocaro! I'm here to help you out with any bugs, questions, or contributions you have for the project. Let's get that issue sorted out.

It seems like you've encountered a common issue where PaperQA attempts to connect to OpenAI despite being configured to use a local model. You mentioned that you resolved this by adding the `agent=AgentSettings(...)` line, which points the agent LLM at your local Ollama model. Your approach of adding a dummy API key and configuring the agent LLM this way is a reasonable workaround.

For further reference, there are similar discussions on this topic, such as the Local LLM using Ollama without OpenAI thread, which is still open and might provide additional insights or solutions [1].
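For reference, a minimal fully local configuration along these lines might look like the sketch below. It combines the README's local-LLM example with the `AgentSettings` fix; the Ollama `api_base` (`http://localhost:11434`) and the embedding model name are assumptions about a default local install, so adjust them to your setup.

```python
from paperqa import Settings, ask
from paperqa.settings import AgentSettings

# litellm routing config pointing at a local Ollama server.
# The api_base assumes Ollama's default port; the api_key is a dummy
# value so litellm does not go looking for a real OpenAI key.
local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3.2",
            "litellm_params": {
                "model": "ollama/llama3.2",
                "api_base": "http://localhost:11434",
                "api_key": "sk-no-key-required",
            },
        }
    ]
}

answer = ask(
    "What is PaperQA?",
    settings=Settings(
        llm="ollama/llama3.2",
        llm_config=local_llm_config,
        summary_llm="ollama/llama3.2",
        summary_llm_config=local_llm_config,
        # Without this, the agent LLM keeps its OpenAI default, which is
        # what triggers the unexpected OpenAI connection.
        agent=AgentSettings(
            agent_llm="ollama/llama3.2",
            agent_llm_config=local_llm_config,
        ),
        # Assumption: an embedding model you have pulled locally, e.g.
        # via `ollama pull mxbai-embed-large`.
        embedding="ollama/mxbai-embed-large",
    ),
)
print(answer)
```

The essential piece is the `agent=AgentSettings(...)` entry: configuring only `llm` and `summary_llm` still leaves the agent LLM on its OpenAI default.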
Hi, I wanted to use Ollama locally with paperqa, but the code provided in the main README is not working: it tries to connect to OpenAI no matter what I do. I managed to fix it by adding the line

```python
agent=AgentSettings(agent_llm='ollama/llama3.2', agent_llm_config=local_llm_config),
```

as discussed in this discussion; I also tried adding a dummy key, `"api_key": "sk-no-key-required",`, to the litellm parameters.

My environment looks like this:
python=3.12.3
My requirements: requirements.txt
My code: