I get the following error: list index out of range #67

Hi,
in LibreChat connected to an optillm proxy served via Docker, I get the following error:

2024-10-21 06:56:48 error: [handleAbortError] AI response error; aborting request: 500 "list index out of range"

Prompt:

<optillm_approach>bon|moa|mcts|cot_reflection</optillm_approach>
There are two hippos in front of a some hippo, two hippos behind a some hippo and a some hippo in the middle. How many hippos are there?

The same happens with most other approaches. It looks like optillm is returning messages that are not compliant with the OpenAI API standard.
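For reference, the request can be reproduced outside LibreChat by calling the optillm proxy directly with the OpenAI Python client. This is a minimal sketch; the base URL, API key handling, and model name are assumptions for a default local setup:

```python
# Minimal repro sketch, assuming a default local optillm setup; the
# proxy address, API key, and model name are placeholders to adjust.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed optillm proxy address
    api_key="sk-placeholder",             # placeholder; optillm forwards to the backend
)

response = client.chat.completions.create(
    model="example-model",  # placeholder model name
    messages=[{
        "role": "user",
        "content": (
            "<optillm_approach>bon|moa|mcts|cot_reflection</optillm_approach>\n"
            "There are two hippos in front of a some hippo, two hippos "
            "behind a some hippo and a some hippo in the middle. "
            "How many hippos are there?"
        ),
    }],
)
print(response.choices[0].message.content)
```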
Comments
And there is one more error in the logs:

2024-10-21 07:39:53 error: [OpenAIClient] Known OpenAI error: Error: missing role for choice 0

When this error occurs, LibreChat still displays a message, but it would also be good to fix it.
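For context, "missing role for choice 0" points at the role field that the OpenAI chat completions format requires on every choice. A sketch of the expected response shape (field values here are illustrative):

```python
# Sketch of the response shape LibreChat expects per the OpenAI chat
# completions spec: every entry in "choices" must carry a "message"
# object with a "role" field. The error suggests the proxied response
# omits it. All field values below are illustrative.
compliant_response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "created": 1729490393,
    "model": "example-model",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",  # the field the error says is missing
                "content": "There are three hippos.",
            },
            "finish_reason": "stop",
        }
    ],
}
```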
This error is connected with moa (<optillm_approach>moa</optillm_approach>), and it occurs when vLLM is used for model inference:

2024-10-21 08:12:42,987 - INFO - Received request to /v1/chat/completions
Does vLLM support returning multiple responses from the /v1/chat/completions endpoint? (Line 16 in 95cc14d.) For cot_reflection, do you get the same error? Unfortunately, I cannot test vLLM locally, as I am on a Mac M3 and vLLM doesn't support it (vllm-project/vllm#2081).
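If moa requests several candidates in a single call, roughly like the sketch below (this is an assumption about what the referenced line does, not a quote of it), then a backend that returns only one choice would raise exactly this IndexError:

```python
# Hedged sketch of a moa-style request, assuming the referenced line asks
# the backend for three candidates in one call via the OpenAI client.
# Base URL, API key, and model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-placeholder")

response = client.chat.completions.create(
    model="example-model",  # placeholder
    messages=[{"role": "user", "content": "How many hippos are there?"}],
    n=3,  # ask for three candidate completions
)
# If the backend ignores n=3 and returns a single choice, this line is
# where "list index out of range" surfaces:
candidates = [response.choices[i].message.content for i in range(3)]
```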
Yes, it returns multiple responses. cot_reflection is OK.
Do you have the same problem (running moa) with another model? And are you calling it with the right chat template?
I was sending too many different approaches at once (bon|moa|mcts|cot_reflection), so to sort it out:
- the problem with vLLM (that error shows up in the optillm logs)
- the problem with what optillm returns to LibreChat (that error shows up in the LibreChat logs)
A direct connection from LibreChat to vLLM doesn't cause such an error.
This particular error looks like a known issue with LibreChat: danny-avila/LibreChat#1222
OK, so only "list index out of range" is left.
For that, can you please run it again? I added more logging to help figure out where it is failing.
Yeah, vLLM is not returning 3 responses. Can you get the 3 responses if you set n=3 in the request?
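Something like this, run directly against the vLLM server, would confirm it (the URL and model name are placeholders):

```python
# Quick check directly against the vLLM OpenAI-compatible endpoint:
# does n=3 actually yield three choices? URL and model are placeholders.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",  # assumed vLLM address
    json={
        "model": "example-model",  # placeholder
        "messages": [{"role": "user", "content": "Say hi."}],
        "n": 3,
    },
)
data = resp.json()
print(len(data["choices"]))  # expect 3 if vLLM honors n
for choice in data["choices"]:
    print(choice["index"], choice["message"].get("role"))
```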