v0.14.6 - Ollama & News Updates
Chatbot Updates
- Expand `/news/` RAG command to include reference URL links in news article headlines.
- Add response statistics (number of tokens and tokens per second) to the footer; a sketch of the calculation follows this list.
- Serve a local copy of the `socket.io.js` library to help with air-gapped installations.
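The tokens-per-second figure in the footer is simply the completion token count divided by the response time. Below is a minimal Python sketch of that calculation, assuming the token count comes from the API's usage data; the function and variable names are hypothetical, not the project's actual code:

```python
import time

def response_stats(completion_tokens: int, started_at: float) -> str:
    """Format footer statistics from a completion's token count and start time."""
    elapsed = max(time.time() - started_at, 1e-6)  # guard against divide-by-zero
    tokens_per_second = completion_tokens / elapsed
    return f"{completion_tokens} tokens, {tokens_per_second:.1f} tokens/sec"

# Example usage: time the LLM call, then report the usage count it returned.
start = time.time()
# ... send the prompt and wait for / stream the response here ...
print(response_stats(completion_tokens=256, started_at=start))
```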
Ollama Support
- Add logic to the chatbot to support OpenAI API servers that do not implement the `/v1/models` API. This allows the Chatbot to work with Ollama, provided the user specifies the `LLM_MODEL` (a sketch of the fallback logic follows the example below). Example docker run script:
```bash
docker run \
  -d \
  -p 5000:5000 \
  -e PORT=5000 \
  -e OPENAI_API_KEY="Asimov-3-Laws" \
  -e OPENAI_API_BASE="http://localhost:11434/v1" \
  -e LLM_MODEL="llama3" \
  -e USE_SYSTEM="false" \
  -e MAXTOKENS=4096 \
  -e TZ="America/Los_Angeles" \
  -v $PWD/.tinyllm:/app/.tinyllm \
  --name chatbot \
  --restart unless-stopped \
  jasonacox/chatbot
```
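For reference, here is a minimal Python sketch of this kind of fallback, assuming the `openai` v1 client and the environment variables from the docker example above. It illustrates the approach rather than the project's actual implementation:

```python
import os
from openai import OpenAI  # assumes the openai>=1.0 Python client

client = OpenAI(
    api_key=os.environ.get("OPENAI_API_KEY", "Asimov-3-Laws"),
    base_url=os.environ.get("OPENAI_API_BASE", "http://localhost:11434/v1"),
)

def resolve_model() -> str:
    """Pick a model: ask the server, or fall back to the configured LLM_MODEL."""
    try:
        models = client.models.list()   # calls GET /v1/models
        return models.data[0].id        # server advertised at least one model
    except Exception:
        # The server rejects /v1/models, so trust the user-supplied
        # LLM_MODEL instead of failing at startup.
        return os.environ.get("LLM_MODEL", "llama3")

print("Using model:", resolve_model())
```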
Full Changelog: v0.14.4...v0.14.6