
Using chat instead of llm will not answer questions which chat can answer itself #4660

Answered by LouisSilva
sunlongjian asked this question in Q&A

The problem is that the CHAT_ZERO_SHOT_REACT_DESCRIPTION and ZERO_SHOT_REACT_DESCRIPTION agents expect the LLM to call a tool on every turn. If you ask something that no tool can answer, the OutputParser raises an error: when you asked the agent how it was, it couldn't use its llm-math tool or searx-search tool because neither was needed, so parsing failed. If you run your code again but ask it a maths problem (so it uses the maths tool), it will work.

If you want an agent that can use its tools and talk to you like a normal chat bot, then use the CHAT_CONVERSATIONAL_REACT_DESCRIPTION agent. The link to the documentation is here. All you will need to do in your cod…
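For reference, a minimal sketch of that change might look like the following, assuming the OpenAI chat model and the llm-math tool from your original snippet (swap in your own model and tools):

```python
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)

# The conversational agent expects chat history in memory under "chat_history".
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    memory=memory,
    verbose=True,
)

agent.run("Hi, how are you?")  # answered directly, no tool call required
agent.run("What is 23 * 17?")  # routed through the llm-math tool
```

With this agent type, small talk is answered directly while questions that need a calculation or a search still go through the tools.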

Answer selected by sunlongjian