
How to create pinecone embeddings offline for use by online bot. #3754

Answered by Hamas-ur-Rehman
svange asked this question in Q&A

I have done something similar using Chroma (chromadb). You might need to install it first:

```shell
pip install chromadb
```

```python
from langchain.vectorstores import Chroma

# Define your document loader and splitter here, and collect the
# split documents into the `texts` variable.

docsearch = Chroma.from_texts(
    [t.page_content for t in texts],
    embeddings,
    collection_name=collection_name,
    persist_directory=f'db/{collection_name}',
)
docsearch.persist()
docsearch = None
```

This builds the embeddings offline and persists them to disk. You can then write an API that serves queries from this vector store: load the persisted store once at startup and query it from there.

```python
vectorstore = Chroma(collection_name=f'{collection_name}',p…
```
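To make the offline-embed / online-query split concrete without depending on any particular vector store, here is a minimal, library-agnostic sketch of the same pattern. Everything here is illustrative and not from the original answer: the file name `store.pkl`, the toy two-dimensional vectors standing in for real embeddings, and the brute-force cosine-similarity search.

```python
import math
import pickle

def cosine(a, b):
    # Cosine similarity between two vectors of equal length.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# --- offline step: embed documents and persist the store to disk ---
# (toy 2-D vectors stand in for real embedding output)
store = {
    "doc1": ([1.0, 0.0], "refund policy text"),
    "doc2": ([0.0, 1.0], "shipping policy text"),
}
with open("store.pkl", "wb") as f:
    pickle.dump(store, f)

# --- online step: load once at startup, then serve queries ---
with open("store.pkl", "rb") as f:
    loaded = pickle.load(f)

def query(vec, k=1):
    # Rank stored documents by similarity to the query vector.
    ranked = sorted(loaded.items(),
                    key=lambda kv: cosine(vec, kv[1][0]),
                    reverse=True)
    return [text for _, (_, text) in ranked[:k]]

print(query([0.9, 0.1]))  # → ['refund policy text']
```

A real deployment would replace the pickle file with Chroma's `persist_directory` and the brute-force scan with the store's own similarity search, but the split is the same: build and persist offline, load and query online.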

Answer selected by svange