Add local models to non-streaming accept list #14420

Open
wants to merge 1 commit into master
Conversation

@MatthewKhouzam commented Nov 8, 2024

What it does

Allows local model orchestrators like GPT4All to serve as a back end for Theia's AI features by making disableStreaming configurable for them.

Starts to address issue #14413.

How to test

  • Configure a custom OpenAI model
  • Add disableStreaming: true
  • Observe (or debug) that the non-streaming API is used

Example OpenAI configuration (using the official OpenAI endpoint):

        {
            "model": "gpt-4o",
            "id": "my-custom-gpt-4o",
            "url": "https://api.openai.com/v1",
            "apiKey": true,
            "disableStreaming": true
        }
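
For reference, here is a minimal sketch of the behavioral difference being tested, using the openai npm client directly. This is illustrative only and not Theia's actual implementation: with disableStreaming: true the request is expected to be sent without `stream: true`, so a single complete response is returned instead of a chunk stream. The prompt and client setup below are placeholders.

```ts
import OpenAI from 'openai';

// Illustrative only: roughly what a non-streaming request looks like.
const client = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
    baseURL: 'https://api.openai.com/v1'
});

async function nonStreamingCompletion(prompt: string): Promise<string> {
    // No `stream: true` here, so the whole completion is returned at once.
    const completion = await client.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: prompt }]
    });
    return completion.choices[0].message.content ?? '';
}
```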

Follow-ups

We need to add a max_token setting, as different orchestrators stop at different lengths.
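
One possible shape for that follow-up, sketched purely as an assumption (neither the setting nor its wiring exists in this PR): a per-model maxTokens value forwarded as the standard max_tokens request parameter.

```ts
import OpenAI from 'openai';

// Hypothetical sketch: `maxTokens` is NOT part of this PR.
async function completionWithLimit(client: OpenAI, prompt: string, maxTokens: number): Promise<string> {
    const completion = await client.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: prompt }],
        max_tokens: maxTokens // would make different orchestrators stop at a predictable length
    });
    return completion.choices[0].message.content ?? '';
}
```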

@sdirix
Member

sdirix left a comment

Thanks for this hotfix.

In general we should make this configurable per (custom) model with an additional attribute.

@JonasHelming
Contributor

It is not only "custom" models: o1-preview also does not support streaming at the moment.
In general it probably makes sense to allow custom parameters per model on a global level.

@sdirix
Member

sdirix commented Nov 11, 2024

Middleground suggestion to get this in quickly:

  • For now we keep the hard-coded non-streaming list of OpenAI models, since we maintain them manually anyway. This keeps the nicer UI for OpenAI models.
  • For all custom models we add a disableStreaming attribute, which is false by default (roughly as sketched below).
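
A rough sketch of the suggested split; the names and shapes below are assumptions for illustration, not the PR's actual code. Official OpenAI models consult a manually maintained non-streaming list, while custom models consult their own disableStreaming flag.

```ts
// Sketch only; identifiers are assumptions, not the PR's actual code.
interface CustomOpenAiModelDescription {
    model: string;
    id: string;
    url?: string;
    apiKey?: string | true;
    disableStreaming?: boolean; // false by default
}

// Manually maintained list of official OpenAI models that must not stream
// (the conversation mentions o1-preview as one such model).
const NON_STREAMING_OPENAI_MODELS = new Set(['o1-preview']);

function useStreaming(officialModel?: string, custom?: CustomOpenAiModelDescription): boolean {
    if (officialModel !== undefined) {
        // Official models keep the hard-coded list and the nicer UI.
        return !NON_STREAMING_OPENAI_MODELS.has(officialModel);
    }
    // Custom models stream unless disableStreaming is explicitly set.
    return !(custom?.disableStreaming ?? false);
}
```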

Custom OpenAI models can now be configured with
'disableStreaming: true' to indicate that streaming shall not be used.
This is especially useful for models which do not support streaming at
all.

Co-authored-by: Matthew Khouzam <[email protected]>
@sdirix
Member

sdirix commented Nov 12, 2024

Adapted the PR. @MatthewKhouzam can you check whether this works for you?

@sdirix
Member

sdirix commented Nov 15, 2024

@MatthewKhouzam Did you have a chance to look at the changes? Can we merge?
