
Improvements to chatbot implementation (allow chat follow ups and better vector search) #492

Open
alexbhandari opened this issue Nov 28, 2024 · 0 comments


This feature aims to improve the chatbot by addressing the two issues below.

  1. The chat feature does not seem to allow follow-ups.
    With past history in the chat, the query "What did I ask you last?" returns: "Unfortunately, since this is our first interaction, there is no previous conversation or question to recall."

  2. The chatbot's vector search does not work very well, especially with many notes.
    Query: "Tell me about myself." Returns context specific to only the few notes that match the vector search; in this example I would expect the LLM to be able to generate a better query for the vector search.
    Query: "How many notes do I have?" Returns: "There are 3 notes in the provided text." when I actually have hundreds of notes.

Describe the solution you'd like (split by issue 1 and 2)

  1. Add chat history context. This was already added to the writing assistant last month, but it does not seem to be wired into the chatbot (see the sketch after this list).

  2. Better integration of the chatbot with vector search. Add built-in prompting that gives the model some context (e.g. that it is running inside a note-taking app) and a way to query the vector database for additional information.
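
For issue 1, a minimal sketch of what passing chat history could look like, assuming an injected `callLLM` helper and a simple `ChatMessage` shape; both names are hypothetical and not the app's actual API:

```ts
// Minimal sketch: keep the running conversation and send it with each request,
// instead of sending only the latest user message.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string }

const history: ChatMessage[] = []

async function chatTurn(
  userQuery: string,
  retrievedContext: string,
  callLLM: (messages: ChatMessage[]) => Promise<string>, // hypothetical helper
): Promise<string> {
  // Prepend retrieved notes as context, then the full prior exchange,
  // then the new user message.
  const messages: ChatMessage[] = [
    { role: 'system', content: `Relevant notes:\n${retrievedContext}` },
    ...history,
    { role: 'user', content: userQuery },
  ]
  const answer = await callLLM(messages)

  // Persist both sides of the turn so follow-ups like
  // "What did I ask you last?" can be answered.
  history.push({ role: 'user', content: userQuery })
  history.push({ role: 'assistant', content: answer })
  return answer
}
```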

Current implementation:
user query → vector store search → query + search results → LLM → response

Proposed:
user query → vector store search → query + search results → LLM → LLM generates vector search query (if needed) → additional vector store search (if needed) → query + new search results → LLM generates response

Note: In the proposed model, the context size passed from the initial vector search should be reduced. Currently the chatbot gives more weight to the search results than to the query itself.
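
A minimal sketch of the proposed flow, assuming injected `searchVectorStore` and `callLLM` helpers (both hypothetical, not the app's actual API); the first-pass `k` is kept small per the note above:

```ts
// Sketch of the two-pass retrieval flow: initial search, let the LLM decide
// whether it needs a better search query, optional second search, final answer.
async function answerWithRetrieval(
  userQuery: string,
  searchVectorStore: (query: string, k: number) => Promise<string[]>, // hypothetical
  callLLM: (prompt: string) => Promise<string>, // hypothetical
): Promise<string> {
  // 1. Initial search with a small k, so the first-pass context does not
  //    dominate the user's actual question.
  const firstPass = await searchVectorStore(userQuery, 3)

  // 2. Ask the model whether it needs more context and, if so, what to search for.
  const followUp = await callLLM(
    `You are a chat assistant inside a note-taking app.\n` +
      `User question: ${userQuery}\n` +
      `Notes found so far:\n${firstPass.join('\n---\n')}\n` +
      `If these notes are enough to answer, reply with exactly NONE.\n` +
      `Otherwise reply with a better search query for the note database.`,
  )

  // 3. Optional second search driven by the model-generated query.
  let context = firstPass
  if (followUp.trim() !== 'NONE') {
    const secondPass = await searchVectorStore(followUp.trim(), 10)
    context = [...firstPass, ...secondPass]
  }

  // 4. Final answer grounded in the combined context.
  return callLLM(
    `Answer the user's question using their notes.\n` +
      `Notes:\n${context.join('\n---\n')}\n` +
      `Question: ${userQuery}`,
  )
}
```

The "reply with exactly NONE" convention is just one way to let the model signal that no second search is needed; a structured (e.g. JSON) response would work as well.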
