Problem Statement
I want to configure self-hosted LLM backends with Kong. I deployed Kong with the Gateway Operator.
I see that the AIGateway K8s custom resource has a .spec.largeLanguageModels.cloudHosted field. Its description states:
"This is currently a required field, requiring at least one cloud-hosted LLM be specified, however in future iterations we may add other hosting options such as self-hosted LLMs as separate fields."
Are there any plans to add self-hosted LLM options to AIGateway? Are there other ways to configure self-hosted LLM backends with Kong deployed with the Gateway Operator?
Acceptance Criteria
I can configure self-hosted LLM backends with the AIGateway CR
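For context, this is roughly what the cloud-hosted-only configuration looks like today. This is a sketch, not a verified manifest: the field names under cloudHosted (identifier, model, promptType, aiCloudProvider) and the credentials reference reflect my reading of the alpha CRD and may differ in your installed version of the Gateway Operator.

```yaml
apiVersion: gateway-operator.konghq.com/v1alpha1
kind: AIGateway
metadata:
  name: kong-aigateway
spec:
  gatewayClassName: kong
  largeLanguageModels:
    # cloudHosted is currently the only (and required) hosting option;
    # there is no selfHosted sibling field yet.
    cloudHosted:
      - identifier: chat-model
        model: gpt-4o
        promptType: chat
        aiCloudProvider:
          name: openai
  # Secret holding the cloud provider API key (assumed name).
  cloudProviderCredentials:
    name: ai-cloud-credentials
```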
We don't have plans at the moment to support self-hosted LLMs in AIGateway, but this is definitely something we could consider.
Are there other ways to configure self-hosted LLM backends with Kong deployed with Gateway Operator?
You could leverage the DataPlane CRD and try configuring all the plugins and environment variables on that. That has not been tested, though, and your mileage may vary.
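If you go that route, one possible starting point is the ai-proxy plugin pointed at an in-cluster endpoint instead of a cloud provider. This is an untested sketch: the Ollama Service name/port and the exact ai-proxy config keys (route_type, llama2_format, upstream_url) are assumptions and should be checked against the plugin documentation for your Kong version.

```yaml
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-self-hosted
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  model:
    # "llama2" provider with the ollama format targets a self-hosted model
    # rather than a cloud API.
    provider: llama2
    name: llama2
    options:
      llama2_format: ollama
      # Hypothetical in-cluster Ollama Service; replace with your endpoint.
      upstream_url: http://ollama.default.svc.cluster.local:11434/api/chat
```

You would then attach this KongPlugin to a route or service fronting your workload, bypassing the AIGateway CR entirely; as noted above, this path is untested.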