Real-Time Voice Assistant using Llama 3, Deepgram, and AWS Polly
- Python 3
- Redis
- Install llama-cpp-python by following the instructions in its official documentation.
- Install requirements
$ pip install -r requirements.txt
- The following environment variables are required:
AWS_ACCESS_KEY=<Your AWS Access Key>
AWS_SECRET_KEY=<Your AWS Secret Key>
DEEPGRAM_API_KEY=<Your Deepgram API Key>
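Before starting the app, it can help to verify these variables are actually set. A minimal sketch (the variable names come from this README; the helper function is illustrative, not part of the repo):

```python
import os

# Required variables, as listed in this README.
REQUIRED_VARS = ["AWS_ACCESS_KEY", "AWS_SECRET_KEY", "DEEPGRAM_API_KEY"]

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: against an empty environment, all three are reported missing.
print(missing_vars(env={}))
```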
- (Optional) Customize your settings in settings.py. You may want to change the language; by default it is set to es-ES.
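For reference, the language setting might look like the following. This is a hypothetical sketch of the relevant line in settings.py; the actual file in the repo may use a different variable name:

```python
# settings.py (sketch) -- LANGUAGE is an assumed name for illustration.
# The README states the default language is es-ES (Spanish).
LANGUAGE = "es-ES"  # change to e.g. "en-US" for English

print(LANGUAGE)
```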
- Run the worker
$ celery -A tasks worker --loglevel=info
- Run the app
$ python app.py
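Conceptually, each utterance flows through three stages: Deepgram transcribes the audio, Llama 3 generates a reply, and AWS Polly speaks it. A hedged sketch of that loop, with each stage stubbed out so the flow runs without credentials (all function names here are illustrative, not the repo's actual API):

```python
# Sketch of the assistant's pipeline: speech in -> text -> LLM reply -> speech out.
# The real app wires up Deepgram (STT), Llama 3 (LLM), and AWS Polly (TTS);
# these stubs only demonstrate the shape of the data flow.

def transcribe(audio: bytes) -> str:
    """Stub standing in for the Deepgram speech-to-text call."""
    return "hola"

def generate_reply(prompt: str) -> str:
    """Stub standing in for the Llama 3 completion call."""
    return f"respuesta a: {prompt}"

def synthesize(text: str) -> bytes:
    """Stub standing in for the AWS Polly text-to-speech call."""
    return text.encode("utf-8")

def handle_utterance(audio: bytes) -> bytes:
    text = transcribe(audio)
    reply = generate_reply(text)
    return synthesize(reply)

print(handle_utterance(b"...").decode("utf-8"))
```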
Right now, the assistant is configured to speak Spanish; you can change the language in settings.py.
On macOS, it's recommended to run the worker with the solo pool, because the prefork pool is not supported there. To run the worker with the solo pool, use the following command:
$ celery -A tasks worker --loglevel=info --pool=solo