cuappdev/ai-dev
This is a Next.js project bootstrapped with create-next-app.

Getting Started

First, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.
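For orientation, here is a minimal sketch of what a page component at app/page.tsx looks like (a placeholder, not this repo's actual page):

// app/page.tsx - a minimal placeholder sketch, not the repo's actual page
export default function Home() {
  return (
    <main>
      <h1>ai-dev</h1>
      <p>Edit app/page.tsx and save; the dev server hot-reloads this page.</p>
    </main>
  );
}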

This project uses next/font to automatically optimize and load Geist, a new font family for Vercel.
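The font setup typically lives in the root layout. A minimal sketch, assuming Geist is imported from next/font/google as in recent create-next-app templates (the repo may instead load it via next/font/local):

// app/layout.tsx - a minimal sketch of loading Geist; the repo's actual layout may differ
import { Geist } from "next/font/google";
import type { ReactNode } from "react";

const geistSans = Geist({
  variable: "--font-geist-sans",
  subsets: ["latin"],
});

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body className={geistSans.variable}>{children}</body>
    </html>
  );
}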

Learn More

To learn more about Next.js, take a look at the Next.js documentation and the interactive Learn Next.js tutorial.

You can also check out the Next.js GitHub repository - your feedback and contributions are welcome!

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.

API Testing

Local base URL: https://0.0.0.0:3000

Pull a model and run a test generation against Ollama directly:

curl -X POST "http://0.0.0.0:11434/api/pull" -d '{"model":"llama3.2:1b"}'
curl -X POST "http://0.0.0.0:11434/api/generate" -d '{"model":"llama3.2:1b", "prompt":"hello"}'

Chat through the app's models endpoint:

curl -X POST "http://localhost:3000/api/models" \
  -H "Content-Type: application/json" \
  --cookie "" \
  -d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Hello"}]}'

curl -X GET "https://ai.cornellappdev.com/api/models/all"

curl -H "Origin: https://ai.cornellappdev.com"
-H "Access-Control-Request-Method: POST"
-H "Access-Control-Request-Headers: X-Requested-With"
-X OPTIONS --verbose
https://ai.cornellappdev.com/api/models

curl -X POST "https://ai.cornellappdev.com/api/models"
-H "Content-Type: application/json"
-d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Hello"}]}'
--verbose

curl -X POST "https://ai.cornellappdev.com/api/models"
-H "Content-Type: application/json"
-H "Origin: https://ai.cornellappdev.com"
-d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Hello"}]}'
--verbose
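The preflight check above assumes the route answers OPTIONS with the appropriate CORS headers. A hypothetical sketch of such a handler as a Next.js route handler (not necessarily how this repo implements it):

// app/api/models/route.ts - hypothetical CORS preflight handler sketch
import { NextResponse } from "next/server";

export async function OPTIONS() {
  return new NextResponse(null, {
    status: 204,
    headers: {
      "Access-Control-Allow-Origin": "https://ai.cornellappdev.com",
      "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
      "Access-Control-Allow-Headers": "Content-Type, X-Requested-With",
    },
  });
}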

Generate from inside a running container (replace e345672ce6b2 with the Ollama container ID):

docker exec -it e345672ce6b2 curl -X POST "http://0.0.0.0:11434/api/generate" -d '{"model":"llama3.2:1b", "prompt":"hello"}'

The Docker network is thestack_app-network. The commands below run a throwaway curlimages/curl container on that network to hit the ollama and ai-dev-app services directly.

Pull Model (ollama):

docker run --rm --network thestack_app-network curlimages/curl \
  -X POST "http://ollama:11434/api/pull" \
  -d '{"model":"llama3.2:1b"}'

Get active models (ollama):

docker run --rm --network thestack_app-network curlimages/curl \
  -X GET "http://ollama:11434/api/ps"

Get all models (ollama):

docker run --rm --network thestack_app-network curlimages/curl \
  -X GET "http://ollama:11434/api/tags"

Generate (ollama):

docker run --rm --network thestack_app-network curlimages/curl \
  -X POST "http://ollama:11434/api/generate" \
  -d '{"model":"llama3.2:1b", "prompt":"hello"}'

Chat (ollama):

docker run --rm --network thestack_app-network curlimages/curl \
  -X POST "http://ollama:11434/api/chat" \
  -d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Why is the sky blue"}]}'

Get all models (app):

docker run --rm --network thestack_app-network curlimages/curl \
  -X GET "http://ai-dev-app:3000/api/models/all"

Get active models (app):

docker run --rm --network thestack_app-network curlimages/curl \
  -X GET "http://ai-dev-app:3000/api/models/active"

Chat (app):

docker run --rm --network thestack_app-network curlimages/curl \
  -X POST "http://ai-dev-app:3000/api/chat" \
  -d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Hello"}]}'

Chat Models (app):

docker run --rm --network thestack_app-network curlimages/curl \
  -X POST "http://ai-dev-app:3000/api/models" \
  -d '{"model":"llama3.2:1b", "messages": [{"role":"user", "content":"Hello"}]}'
