
ToolCall issue in LM Studio - Model : Llama 3.1 #75

Open
Vikneshkumarmohan opened this issue Sep 16, 2024 · 5 comments
Labels: enhancement (New feature or request)

@Vikneshkumarmohan

Tool calls are not generated in the response from the llama 3.1 model served by LM Studio when connecting through the LangChain framework via ChatOpenAI. The same tool call works fine with Ollama for the same llama 3.1 model.

According to the LangChain team, the response from the model was not well-formed; the tool call works as expected with Ollama.

You can also refer to issue #26342 in langchain-ai.

Below is the code snippet:

```python
from typing import Annotated

import os

from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.messages import BaseMessage
from langchain_openai import ChatOpenAI
from typing_extensions import TypedDict

from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition

# Get the env value
TAVILY_API_KEY = os.getenv("TAVILY_API_KEY")

# Point to the local server
llm = ChatOpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

class State(TypedDict):
    messages: Annotated[list, add_messages]

graph_builder = StateGraph(State)

tool = TavilySearchResults(max_results=2, include_answer=True)
tools = [tool]

llm_with_tools = llm.bind_tools(tools)

def chatbot(state: State):
    print(state)
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

graph_builder.add_node("chatbot", chatbot)

tool_node = ToolNode(tools=[tool])
graph_builder.add_node("tools", tool_node)

graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

# Any time a tool is called, we return to the chatbot to decide the next step
graph_builder.add_edge("tools", "chatbot")
graph_builder.set_entry_point("chatbot")
graph = graph_builder.compile()

while True:
    user_input = input("User: ")
    if user_input.lower() in ["quit", "exit", "q"]:
        print("Goodbye!")
        break
    for event in graph.stream({"messages": [("user", user_input)]}):
        for value in event.values():
            if isinstance(value["messages"][-1], BaseMessage):
                print("Assistant:", value["messages"][-1].content)
```
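One way to see what the LangChain team meant by the response not being well-formed is to look at the raw OpenAI-format response from the server and check for a `tool_calls` array. The JSON below is only an illustration of the shape a correct tool-calling response has (the call id and arguments are made up); a server that fails to emit tool calls returns a plain `content` string and no `tool_calls` field instead.

```python
import json

# Illustrative OpenAI-format chat completion response showing the shape
# of a successful tool call. The id, name, and arguments are examples.
sample = json.loads("""
{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_1",
            "type": "function",
            "function": {
              "name": "tavily_search_results_json",
              "arguments": "{\\"query\\": \\"weather in Paris\\"}"
            }
          }
        ]
      }
    }
  ]
}
""")

# Extract the tool calls, if any; an empty list here means the model
# answered with plain text instead of invoking a tool.
message = sample["choices"][0]["message"]
calls = message.get("tool_calls") or []
for call in calls:
    args = json.loads(call["function"]["arguments"])
    print(call["function"]["name"], args)
```

Comparing this against what LM Studio actually returns for the same prompt should show whether the `tool_calls` array is missing or malformed.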

@whogben

whogben commented Oct 8, 2024

As far as I know LMStudio does not yet support tool calls, but it will be fantastic when it does! There's a small chance I'm wrong and someone will correct me, I just couldn't find any reference to tool calling in the docs and the tools parameter is not listed on the server chat completions endpoint. If not, gotta stick to ollama and other alternatives when using anything that relies on tool calls.
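One way to check this yourself is to send the `tools` parameter in a raw chat completions request and see whether the server honors it. The sketch below only builds the request payload in the shape the OpenAI chat completions API defines (the model name and tool schema are illustrative, and `bind_tools` in the snippet above serializes LangChain tools into this same format); posting it to `http://localhost:1234/v1/chat/completions` shows whether LM Studio accepts or silently ignores the field.

```python
import json

# Shape of a chat completions request with a `tools` array, per the
# OpenAI API format. Model name and tool schema here are illustrative.
payload = {
    "model": "llama-3.1-8b-instruct",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "tavily_search_results_json",
                "description": "Search the web and return results.",
                "parameters": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
        }
    ],
}

# POST this body to the local server's /v1/chat/completions endpoint
# (e.g. with requests or curl). A server that supports tool calling may
# respond with a `tool_calls` array; one that ignores `tools` returns
# plain text.
print(json.dumps(payload, indent=2))
```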

@hansvdam

No, it does not seem to support it, unfortunately.

@yagil
Member

yagil commented Oct 26, 2024

We are working on this

@yagil yagil added the enhancement New feature or request label Oct 26, 2024
@yagil
Member

yagil commented Nov 5, 2024

We're about to start a beta for this. If you're interested, please fill out this google form: https://docs.google.com/forms/d/e/1FAIpQLSfqpRunKgv2ui4_CWyeQ88gFtOG6CEqFSbpRrHWgJjs2mIodw/viewform?usp=sf_link

@mihai-satmarean

First, thank you for your great work!

When using the tool-calling function with GPTScript, LM Studio with the model meta-llama-3.1-8b-instruct returns this:

[ERROR]
Error rendering prompt with jinja template: Error: Cannot put tools in the first user message when there's no first user message!. Error Data: n/a, Additional Data: n/a

Any hint is appreciated.
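The error text suggests the model's chat template wants to inject the tool definitions into the first user message, and fails when the conversation starts with something else (e.g. a system or assistant turn). Whether that is the cause here depends on what GPTScript actually sends and on the LM Studio version, but one possible workaround is to make sure the message list begins with a user message. The helper below is only a sketch of that idea (the function name and message contents are made up), using OpenAI-format message dicts:

```python
# Workaround sketch, not an official fix: fold a leading system message
# into the first user turn so the conversation starts with role "user",
# which some chat templates require before they can inject tools.

def ensure_leading_user_message(messages):
    if (
        len(messages) >= 2
        and messages[0]["role"] == "system"
        and messages[1]["role"] == "user"
    ):
        system, first_user = messages[0], messages[1]
        merged = {
            "role": "user",
            "content": system["content"] + "\n\n" + first_user["content"],
        }
        return [merged] + messages[2:]
    return messages

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Search for the weather in Paris."},
]
print(ensure_leading_user_message(msgs))
```

Merging the system prompt into the user turn changes how the model sees the instructions, so treat this as a diagnostic step rather than a drop-in fix.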
