Inferix is a wrapper on top of Ollama. It aims to expose an OpenAI-compatible API along with some extra goodies. It currently supports:

- An OpenAI-compatible REST API on top of Ollama.
- LLM-powered function calling.
- Streaming responses.
- Server-side conversation history, similar to AutoGen.
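Of these features, function calling is the least standardized across servers. The sketch below assumes Inferix accepts the OpenAI-style `tools` schema; that schema, the `llama3` model name, and the `get_weather` tool are all illustrative assumptions, not confirmed by this README.

```python
# Hedged sketch of an LLM-powered function-calling request. This assumes
# Inferix mirrors the OpenAI "tools" format; the schema Inferix actually
# expects may differ, so treat this as an illustration only.
import json

def build_tool_call_request(prompt: str) -> dict:
    """Assemble an OpenAI-style chat request that offers one tool."""
    return {
        "model": "llama3",  # placeholder: any model pulled into Ollama
        "messages": [{"role": "user", "content": prompt}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool name
                    "description": "Look up the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

print(json.dumps(build_tool_call_request("Weather in Paris?"), indent=2))
```

The model is expected to answer with a tool call (name plus JSON arguments) rather than plain text, which the caller then executes and feeds back into the conversation.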
To install and run Inferix, follow these steps:

- Clone the repository:

  ```sh
  git clone https://github.com/YourTechBud/inferix.git
  ```

- Navigate to the project directory:

  ```sh
  cd inferix
  ```

- Install the dependencies:

  ```sh
  poetry install
  ```
To start Inferix, run the following command:

```sh
poetry run start
```

- Open http://localhost:8000/docs for the Swagger UI.
- Open http://localhost:8000/openapi.json to access the raw OpenAPI spec.
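Once the server is up, the OpenAI-compatible endpoint can be exercised from any HTTP client. The sketch below uses only the standard library and assumes the usual OpenAI path `/v1/chat/completions`, response shape, and a `llama3` model name; verify the routes Inferix actually exposes in the Swagger UI above.

```python
# Hedged example: calling Inferix through an assumed OpenAI-compatible
# endpoint using only the standard library. The path and response shape
# are borrowed from the OpenAI API and may differ in Inferix itself.
import json
import urllib.request

def build_chat_payload(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat completion body."""
    return {
        "model": model,  # placeholder: any model available in your Ollama
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, base_url: str = "http://localhost:8000") -> str:
    """Send the request and pull the reply out of the response."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",  # assumed OpenAI-compatible path
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # OpenAI-style responses nest the text under choices[0].message.content
    return data["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires Inferix to be running locally (poetry run start).
    print(chat("Say hi in one word."))
```

Because the API is OpenAI-compatible, existing OpenAI client libraries should also work by pointing their base URL at the Inferix server.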