AIPT is a simple app that checks how well each model handles a specified prompt.
Here's a video of it in action:
AIPT.mp4
The app is really simple: you give it a prompt and the desired output, and it then sends that prompt to every model and compares the outputs.
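AIPT's actual comparison logic may differ, but the core idea — send the same prompt to every model and rank each output against the desired one — can be sketched in a few lines. The model names and outputs below are made up, and `difflib` similarity stands in for whatever comparison the app really uses:

```python
from difflib import SequenceMatcher

def score(output: str, desired: str) -> float:
    """Similarity between a model's output and the desired output (0..1)."""
    return SequenceMatcher(None, output, desired).ratio()

# Hypothetical outputs collected from each model for the same prompt.
outputs = {
    "llama3": "The capital of France is Paris.",
    "mistral": "Paris is the capital of France.",
    "phi3": "I think it might be Lyon.",
}
desired = "The capital of France is Paris."

# Rank models by how closely their output matches the desired output.
ranking = sorted(outputs, key=lambda m: score(outputs[m], desired), reverse=True)
print(ranking[0])  # → llama3 (an exact match scores 1.0)
```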
> **Note**
> The running process should be the same on every OS.
Here are the requirements. You have to have:
- python3
- ollama (must also be running)
- git
- Clone this repo:

  ```shell
  git clone https://github.com/NOTMEE12/AIPT
  cd AIPT
  ```

- Install the required libraries:

  ```shell
  pip install -r requirements.txt
  ```

- Run the app:

  ```shell
  python main.py
  # arguments:
  # --HOST        - IP on which the app will run
  # --PORT        - port
  # --DEBUG       - whether to run the app with debug mode on
  # --OLLAMA_HOST - Ollama host (with port), e.g.: http://127.0.0.1:11434
  ```
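For illustration, here is roughly how flags like these could be parsed with `argparse`. This is a hypothetical sketch, not the actual contents of `main.py`:

```python
import argparse

# Hypothetical sketch of main.py's command-line flags; the real script may differ.
parser = argparse.ArgumentParser(description="AIPT launch options (sketch)")
parser.add_argument("--HOST", default="127.0.0.1", help="IP on which the app will run")
parser.add_argument("--PORT", default=5000, type=int, help="port to listen on")
parser.add_argument("--DEBUG", default=False, type=lambda s: s.lower() == "true",
                    help="run the app with debug mode on")
parser.add_argument("--OLLAMA_HOST", default="http://127.0.0.1:11434",
                    help="Ollama host (with port)")

# Parse an example command line instead of sys.argv, so this runs anywhere.
args = parser.parse_args(["--PORT", "8080", "--DEBUG", "True"])
print(args.PORT, args.DEBUG)  # → 8080 True
```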
Run:

```shell
docker run --name AIPT -p 11432:11432 -e DEBUG=False -e PORT=11432 --add-host host.docker.internal:host-gateway notmee12/aipt python main.py --OLLAMA_HOST="http://host.docker.internal:11434" --HOST="0.0.0.0"
```

If you have ollama running at a different URL, replace the value of `--OLLAMA_HOST` with the URL of your ollama instance.
> **Important**
> If you are using Docker Desktop, you have to run this command in a terminal. (Remember to have Docker Desktop running.)
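Before pointing the container at Ollama, it can help to confirm the server actually answers. A minimal self-contained check (the URL is the default local one; a running Ollama server answers 200 OK on its root URL — adjust the host as needed):

```python
from urllib.request import urlopen
from urllib.error import URLError

def ollama_reachable(host: str = "http://127.0.0.1:11434") -> bool:
    """Return True if an HTTP server answers at `host`.
    A running Ollama server replies 200 OK on its root URL."""
    try:
        with urlopen(host, timeout=2) as resp:
            return resp.status == 200
    except (URLError, OSError, ValueError):
        return False

print(ollama_reachable())  # True only if a local Ollama server is up
```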
- Q: How do I run this tool with ollama from a different device/URL?
- A: To run this tool with ollama from a different device/URL, you need to either:
  - pass an `--OLLAMA_HOST` argument to the run command, e.g. `python main.py --OLLAMA_HOST=...` (replace the `...` with the URL of the ollama host), or
  - set an environment variable `OLLAMA_HOST` to the ollama host.
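The precedence implied by that answer (command-line flag first, then the environment variable, then the local default) can be sketched as a small helper. `resolve_ollama_host` and all the URLs below are hypothetical, not part of AIPT:

```python
import os

def resolve_ollama_host(cli_value=None):
    """Pick the ollama host: CLI flag first, then the OLLAMA_HOST
    environment variable, then the local default.
    (Hypothetical helper illustrating the FAQ answer, not AIPT's code.)"""
    return cli_value or os.environ.get("OLLAMA_HOST") or "http://127.0.0.1:11434"

# A CLI flag wins over everything:
print(resolve_ollama_host("http://10.0.0.5:11434"))  # → http://10.0.0.5:11434

# With no flag, the environment variable is used:
os.environ["OLLAMA_HOST"] = "http://192.168.1.50:11434"
print(resolve_ollama_host())  # → http://192.168.1.50:11434
```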
- Q: Can I use model X, which is not on ollama?
- A: AIPT currently only supports ollama models.
- Q: Can I exclude model X from being tested?
- A: Not yet.