A simple chat interface built with React and Vite for interacting with AI models offline through Ollama.
- Select from models downloaded locally through Ollama
- Live response streaming (see the sketch after this list)
- Long-term conversation memory saved to local storage
- Upload text, CSV, or JSON files (currently limited by context size)
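A minimal sketch of how the streaming and local-storage features can be wired together with the Ollama JavaScript library; the model name, storage key, and `onToken` callback are illustrative stand-ins rather than the app's actual code.

```js
// Sketch: stream a reply from a local model and persist the conversation.
// Model name and storage key are illustrative, not the app's real values.
import ollama from 'ollama/browser';

const STORAGE_KEY = 'chat-history';

async function sendMessage(history, userText, onToken) {
  const messages = [...history, { role: 'user', content: userText }];

  // stream: true yields the assistant's reply incrementally
  const stream = await ollama.chat({ model: 'llama3.1', messages, stream: true });

  let reply = '';
  for await (const part of stream) {
    reply += part.message.content;
    onToken(reply); // e.g. update React state to render the partial reply live
  }

  const updated = [...messages, { role: 'assistant', content: reply }];
  localStorage.setItem(STORAGE_KEY, JSON.stringify(updated)); // conversation memory
  return updated;
}
```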
- Ollama
- llama3.1 model
- mistral model
- llava model - open-source vision model
- Ollama JavaScript Library (see the model-listing sketch after this list)
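The model selector can be filled from Ollama's own list of locally downloaded models. A rough sketch using the library's `list()` call; the `getLocalModels` helper name is an assumption for illustration.

```js
// Sketch: query the local Ollama instance for downloaded models,
// e.g. to populate the model-selector dropdown.
import ollama from 'ollama/browser';

async function getLocalModels() {
  const { models } = await ollama.list(); // models already pulled via `ollama pull`
  return models.map((m) => m.name);       // e.g. ['llama3.1:latest', 'mistral:latest']
}
```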
- Switch between different chats
- Render Markdown cleanly
- Clean up how uploaded file contents are displayed as plain text in the chat
- Add image uploads for vision models (see the sketch after this list)
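For the planned image uploads, the Ollama chat API accepts base64-encoded images on a message, which a vision model such as llava can then describe. The sketch below is a hypothetical approach rather than existing app code; `fileToBase64` and `askAboutImage` are illustrative helpers.

```js
// Hypothetical sketch for the image-upload to-do item.
import ollama from 'ollama/browser';

// Read a File from an <input type="file"> as base64 (data URL prefix stripped).
function fileToBase64(file) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result.split(',')[1]);
    reader.onerror = reject;
    reader.readAsDataURL(file);
  });
}

async function askAboutImage(file, question) {
  const image = await fileToBase64(file);
  const response = await ollama.chat({
    model: 'llava', // open-source vision model
    messages: [{ role: 'user', content: question, images: [image] }],
  });
  return response.message.content;
}
```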