Replies: 1 comment
I think this answers your question: for each API request to Gemini that uses tools, the context is retrieved by the tool (Vertex AI Search or Google Search) and fed into the Gemini prompt to answer the request. So the temperature setting still affects the response.
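A minimal sketch of that point using the google-genai SDK (the SDK used in the referenced notebook); the model name, project, location, and prompt are placeholder assumptions. The grounding tool only supplies retrieved context, while sampling parameters such as `temperature` are set on the generation request itself:

```python
from google import genai
from google.genai.types import GenerateContentConfig, GoogleSearch, Tool

# Placeholders: replace with your own project and location.
client = genai.Client(vertexai=True, project="your-project-id", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents="What is the current weather in Zurich?",
    config=GenerateContentConfig(
        # Grounding tool: search results are retrieved and added to the prompt context.
        tools=[Tool(google_search=GoogleSearch())],
        # Sampling parameters still apply, because the grounded answer is
        # generated by the Gemini model, not by the search tool.
        temperature=0.7,
    ),
)
print(response.text)
```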
Question on grounding, section Google Search and Vertex AI Search (https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/getting-started/intro_gemini_2_0_flash.ipynb).
If I provide some relevant documents in a data store and specify a retrieval tool,
Tool(retrieval=Retrieval(vertex_ai_search=...))
then the relevant content is retrieved and my loaded model generates the response from it, so the output is subject to the temperature setting (assuming my documents contain several perspectives). But if I use Google Search,
Tool(google_search=GoogleSearch())
will the response be generated by the GoogleSearch module and therefore not be affected by the model loaded in my code?
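For reference, a sketch of the two tool configurations the question compares, using the google-genai types; the data store resource name is a placeholder assumption:

```python
from google.genai.types import GoogleSearch, Retrieval, Tool, VertexAISearch

# Grounding with your own documents in a Vertex AI Search data store
# (placeholder resource name).
vertex_ai_search_tool = Tool(
    retrieval=Retrieval(
        vertex_ai_search=VertexAISearch(
            datastore="projects/your-project/locations/global/collections/default_collection/dataStores/your-datastore-id"
        )
    )
)

# Grounding with Google Search results.
google_search_tool = Tool(google_search=GoogleSearch())

# In both cases the retrieved context is inserted into the prompt and the
# loaded Gemini model generates the final answer, so temperature applies.
```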