Llama 3.2 3B AI Server Implementation

This project is under development; contributions are welcome.


To get the current env.template config, reach out to [email protected].

Server Stack

  • FastAPI for the Python backend API.
  • 🐋 Docker Compose for development and production.
  • 🔒 Secure password hashing by default.
  • 🔑 JWT (JSON Web Token) authentication.
  • 📫 Email based password recovery.
  • ✅ Tests with Pytest.
  • 📞 Traefik as a reverse proxy / load balancer.
  • 🚢 Deployment instructions using Docker Compose, including how to set up a frontend Traefik proxy to handle automatic HTTPS certificates.
  • 🏭 CI (continuous integration) and CD (continuous deployment) based on GitHub Actions.
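To illustrate the JWT authentication listed above, here is a minimal sketch of the HS256 signing scheme that underlies it, using only the standard library. This is not the project's actual auth code (a real deployment would use a maintained library such as PyJWT), and `SECRET_KEY` is a placeholder that should come from the environment:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET_KEY = "change-me"  # placeholder; load from the environment in practice


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT spec requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_token(subject: str, ttl_seconds: int = 3600) -> str:
    """Sign a minimal HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(
        json.dumps({"sub": subject, "exp": int(time.time()) + ttl_seconds}).encode()
    )
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(SECRET_KEY.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token: str) -> dict:
    """Check the signature and expiry; return the claims on success."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(SECRET_KEY.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

In the FastAPI app, `verify_token` would typically run inside a dependency so that protected endpoints reject requests without a valid token.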

API Documentation

Service API (provides endpoints to use the model)

LLM implementation / initialisation
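The repository's actual initialisation code is linked above. As one illustrative piece, a Llama 3.x instruct prompt can be assembled by hand; in practice the tokenizer's chat template does this, and the special-token names below follow the published Llama 3 prompt format (this helper is a hypothetical sketch, not this repo's code):

```python
def format_llama3_prompt(messages: list[dict]) -> str:
    """Render chat messages into the Llama 3.x instruct prompt format."""
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Leave the assistant header open so the model generates the reply.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt
```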

Deployment Config

Docker is used; the important files are:

  • Dockerfile (Docker build configuration)
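As a hedged illustration of how a Compose deployment fronted by Traefik might look (the service names, image tag, domain, and labels here are assumptions, not this repo's actual files):

```yaml
services:
  backend:
    build: .
    env_file: .env
    labels:
      - traefik.enable=true
      - traefik.http.routers.backend.rule=Host(`api.example.com`)
      - traefik.http.routers.backend.tls.certresolver=letsencrypt
  proxy:
    image: traefik:v2.11
    command:
      - --providers.docker=true
      - --entrypoints.websecure.address=:443
      - --certificatesresolvers.letsencrypt.acme.email=admin@example.com
      - --certificatesresolvers.letsencrypt.acme.storage=/letsencrypt/acme.json
      - --certificatesresolvers.letsencrypt.acme.tlschallenge=true
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
```

With labels like these, Traefik discovers the backend container via the Docker socket and obtains HTTPS certificates automatically through Let's Encrypt.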

Configure

You can then update the values in the .env file to customize your configuration.
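For reference, a .env file is a list of KEY=VALUE lines; tooling such as Docker Compose or pydantic-settings reads it for you, but parsing it is simple enough to sketch by hand (this hypothetical helper is for illustration, not part of the project):

```python
from pathlib import Path


def load_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping comments and blanks."""
    values: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"')
    return values
```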

Before deploying it, make sure you change at least the values for:

  • SECRET_KEY
  • FIRST_SUPERUSER_PASSWORD
  • POSTGRES_PASSWORD

You can (and should) pass these as environment variables from secrets.
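One common way to generate a strong value for these secrets is Python's standard `secrets` module:

```python
import secrets

# Generate a random URL-safe value suitable for SECRET_KEY or a password.
print(secrets.token_urlsafe(32))
```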

Read the deployment.md docs for more details.

Backend Docs (not yet updated; the template config is viewable)

Backend docs: backend/README.md.

Deployment

Deployment docs: deployment.md.