ModelAuthProvider restricted to @Blocking #1214
Comments
We probably want to provide both a sync and an async version instead of just dropping the sync one completely...
This would mean 2 interfaces. By the way, there seems to be another approach in Gemini to work around the issue:
Probably just a single interface with one async and one sync method
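A minimal sketch of what such a combined interface could look like, assuming the Input shape from the current ModelAuthProvider docs; the async method name and its Mutiny-based default implementation are illustrative assumptions, not the actual API:

```java
import java.net.URI;
import java.util.List;
import java.util.Map;

import io.smallrye.mutiny.Uni;

public interface ModelAuthProvider {

    // existing synchronous method; implementations are allowed to block
    String getAuthorization(Input input);

    // hypothetical async variant; the default simply wraps the sync call,
    // so existing blocking implementations keep working unchanged
    default Uni<String> getAuthorizationAsync(Input input) {
        return Uni.createFrom().item(() -> getAuthorization(input));
    }

    interface Input {
        String method();

        URI uri();

        Map<String, List<Object>> headers();
    }
}
```

A non-blocking provider would override only getAuthorizationAsync, while the runtime would subscribe to the Uni and only move to a worker thread for providers that stick with the blocking default.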
Right, I need to look at this in a more systematic way.
Sure, it would likely be best if having 2 methods can be avoided; I recall that Quarkus REST or the REST Client can also decide whether a filter is blocking based on its return type.
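A hedged sketch of that dispatch idea, where the runtime decides where to run the provider instead of forcing @Blocking on the caller; the helper name and the boolean flag are assumptions about a possible implementation, not existing behaviour:

```java
import io.quarkiverse.langchain4j.auth.ModelAuthProvider;
import io.smallrye.mutiny.Uni;
import io.smallrye.mutiny.infrastructure.Infrastructure;

final class AuthorizationResolver {

    private AuthorizationResolver() {
    }

    // Resolve the Authorization value without blocking the event loop:
    // providers known to block are subscribed on a worker thread, everything
    // else runs inline. Whether a provider blocks could be detected at build
    // time, e.g. from a @Blocking annotation or the method's return type,
    // similar to how Quarkus REST classifies filters.
    static Uni<String> resolve(ModelAuthProvider provider, ModelAuthProvider.Input input,
            boolean providerIsBlocking) {
        Uni<String> authorization = Uni.createFrom().item(() -> provider.getAuthorization(input));
        return providerIsBlocking
                ? authorization.runSubscriptionOn(Infrastructure.getDefaultWorkerPool())
                : authorization;
    }
}
```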
@geoand so you guys take over from here - is there anything you expect from me?
Sure yeah
The next step in the context of "azure-openai" would then be to provide a default implementation somehow, like https://github.com/flyinfish/quarkus-langchain4j/blob/942c8306e43ff3e82e7ef2ea3f61769e1da1de18/model-providers/vertex-ai-gemini/runtime/src/main/java/io/quarkiverse/langchain4j/vertexai/runtime/gemini/VertxAiGeminiRestApi.java#L143 does. But one step after the other.
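For illustration, a rough sketch of what such a default implementation could look like on the azure-openai side: a REST Client filter that suspends the request, resolves the Authorization value off the event loop, and resumes. The class name is made up, the Input bridge assumes the documented method()/uri()/headers() contract, and this is only a sketch of the idea, not the Gemini code linked above:

```java
import java.net.URI;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;

import org.jboss.resteasy.reactive.client.spi.ResteasyReactiveClientRequestContext;
import org.jboss.resteasy.reactive.client.spi.ResteasyReactiveClientRequestFilter;

import io.quarkiverse.langchain4j.auth.ModelAuthProvider;

public class AzureOpenAiAuthFilter implements ResteasyReactiveClientRequestFilter {

    private final ModelAuthProvider authProvider;

    public AzureOpenAiAuthFilter(ModelAuthProvider authProvider) {
        this.authProvider = authProvider;
    }

    @Override
    public void filter(ResteasyReactiveClientRequestContext context) {
        // Suspend the request so the token lookup never runs on the event loop,
        // regardless of whether the provider blocks internally.
        context.suspend();
        CompletableFuture
                .supplyAsync(() -> authProvider.getAuthorization(new ModelAuthProvider.Input() {
                    @Override
                    public String method() {
                        return context.getMethod();
                    }

                    @Override
                    public URI uri() {
                        return context.getUri();
                    }

                    @Override
                    public Map<String, List<Object>> headers() {
                        return context.getHeaders();
                    }
                }))
                .whenComplete((authorization, failure) -> {
                    if (failure != null) {
                        context.resume(failure);
                        return;
                    }
                    if (authorization != null) {
                        context.getHeaders().putSingle("Authorization", authorization);
                    }
                    context.resume();
                });
    }
}
```

Such a filter would then be registered on the Azure OpenAI REST client interface, e.g. via @RegisterProvider.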
@sberyozkin do you plan to look into this?
ISSUE
Actually I am not 100% sure whether this is a bug, an issue, or I just missed something...
I am using ModelAuthProvider as described in https://docs.quarkiverse.io/quarkus-langchain4j/dev/openai.html#_using_authprovider. Everything works fine except that it forces me to use @Blocking when using SSE.
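For context, the kind of provider this is about looks roughly like the following minimal sketch using azure-identity (class name, credential choice and scope are illustrative, not taken from my actual code):

```java
import jakarta.enterprise.context.ApplicationScoped;

import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;

import io.quarkiverse.langchain4j.auth.ModelAuthProvider;

@ApplicationScoped
public class AzureAdAuthProvider implements ModelAuthProvider {

    // DefaultAzureCredential resolves a token from the environment
    // (managed identity, Azure CLI, environment variables, ...)
    private final DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

    @Override
    public String getAuthorization(Input input) {
        // getTokenSync blocks while the token is fetched or refreshed,
        // which is why the call currently has to run on a worker thread
        AccessToken token = credential.getTokenSync(
                new TokenRequestContext().addScopes("https://cognitiveservices.azure.com/.default"));
        return "Bearer " + token.getToken();
    }
}
```

The same credential also exposes a reactive getToken variant returning a Mono, so on Azure's side the token could be obtained without blocking at all.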
Expected Behaviour
Wouldn't it be nice to be able to use non-blocking auth, which at least Azure would provide?
Workaround
Using blocking communication for the whole LLM call.
Reproduce