---
title: "VertexAIGeminiChatGenerator"
id: vertexaigeminichatgenerator
slug: "/vertexaigeminichatgenerator"
description: "`VertexAIGeminiChatGenerator` enables chat completion using Google Gemini models."
---

# VertexAIGeminiChatGenerator

`VertexAIGeminiChatGenerator` enables chat completion using Google Gemini models.

:::warning[Deprecation Notice]

This integration uses the deprecated google-generativeai SDK, which will lose support after August 2025.

We recommend switching to the new [GoogleGenAIChatGenerator](googlegenaichatgenerator.mdx) integration instead.
:::

<div className="key-value-table">

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [ChatPromptBuilder](../builders/chatpromptbuilder.mdx) |
| **Mandatory run variables** | `messages`: A list of [`ChatMessage`](../../concepts/data-classes/chatmessage.mdx) objects representing the chat |
| **Output variables** | `replies`: A list of alternative replies of the model to the input chat |
| **API reference** | [Google Vertex](/reference/integrations-google-vertex) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/google_vertex |

</div>

`VertexAIGeminiChatGenerator` supports the `gemini-1.5-pro`, `gemini-1.5-flash`, and `gemini-2.0-flash` models. Note that [Google recommends upgrading](https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versions) from `gemini-1.5-pro` to `gemini-2.0-flash`.

For available models, see https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models.

:::info
To explore the full capabilities of Gemini, check out this [article](https://haystack.deepset.ai/blog/gemini-models-with-google-vertex-for-haystack) and the related [🧑‍🍳 Cookbook](https://colab.research.google.com/github/deepset-ai/haystack-cookbook/blob/main/notebooks/vertexai-gemini-examples.ipynb).
:::

### Parameters Overview

`VertexAIGeminiChatGenerator` uses Google Cloud Application Default Credentials (ADCs) for authentication. For more information on how to set up ADCs, see the [official documentation](https://cloud.google.com/docs/authentication/provide-credentials-adc).

Keep in mind that it's essential to use an account that has access to a project authorized to use Google Vertex AI endpoints.

You can find your project ID in the [GCP resource manager](https://console.cloud.google.com/cloud-resource-manager) or locally by running `gcloud projects list` in your terminal. For more info on the gcloud CLI, see its [official documentation](https://cloud.google.com/cli).

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in output. To do so, pass a function to the `streaming_callback` init parameter.

## Usage

You need to install the `google-vertex-haystack` package to use the `VertexAIGeminiChatGenerator`:

```shell
pip install google-vertex-haystack
```

### On its own

Basic usage:

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

gemini_chat = VertexAIGeminiChatGenerator()

messages = [ChatMessage.from_user("Tell me the name of a movie")]
res = gemini_chat.run(messages)

print(res["replies"][0].text)
>>> The Shawshank Redemption

messages += [res["replies"][0], ChatMessage.from_user("Who's the main actor?")]
res = gemini_chat.run(messages)

print(res["replies"][0].text)
>>> Tim Robbins
```

When chatting with Gemini Pro, you can also easily use function calls.
First, define the function locally and convert it into a [Tool](../../tools/tool.mdx):

```python
from typing import Annotated
from haystack.tools import create_tool_from_function


## example function to get the current weather
def get_current_weather(
    location: Annotated[
        str,
        "The city for which to get the weather, e.g. 'San Francisco'",
    ] = "Munich",
    unit: Annotated[str, "The unit for the temperature, e.g. 'celsius'"] = "celsius",
) -> str:
    return f"The weather in {location} is sunny. The temperature is 20 {unit}."


tool = create_tool_from_function(get_current_weather)
```

Create a new instance of `VertexAIGeminiChatGenerator` to set the tools and a [ToolInvoker](../tools/toolinvoker.mdx) to invoke them:

```python
from haystack_integrations.components.generators.google_vertex import (
    VertexAIGeminiChatGenerator,
)
from haystack.components.tools import ToolInvoker

gemini_chat = VertexAIGeminiChatGenerator(model="gemini-2.0-flash-exp", tools=[tool])

tool_invoker = ToolInvoker(tools=[tool])
```

And then ask our question:

```python
from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("What is the temperature in celsius in Berlin?")]
replies = gemini_chat.run(messages=messages)["replies"]

print(replies[0].tool_calls)
>>> [ToolCall(tool_name='get_current_weather',
>>> arguments={'unit': 'celsius', 'location': 'Berlin'}, id=None)]

## run the tool calls and append the tool results to the conversation
tool_messages = tool_invoker.run(messages=replies)["tool_messages"]
messages = messages + replies + tool_messages

final_replies = gemini_chat.run(messages=messages)["replies"]
print(final_replies[0].text)
>>> The temperature in Berlin is 20 degrees Celsius.
```

### In a pipeline

```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

## no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
gemini_chat = VertexAIGeminiChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("gemini", gemini_chat)
pipe.connect("prompt_builder.prompt", "gemini.messages")

location = "Rome"
messages = [ChatMessage.from_user("Tell me briefly about {{location}} history")]
res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location}, "template": messages}})

print(res)

>>> - **753 B.C.:** Traditional date of the founding of Rome by Romulus and Remus.
>>> - **509 B.C.:** Establishment of the Roman Republic, replacing the Etruscan monarchy.
>>> - **492-264 B.C.:** Series of wars against neighboring tribes, resulting in the expansion of the Roman Republic's territory.
>>> - **264-146 B.C.:** Three Punic Wars against Carthage, resulting in the destruction of Carthage and the Roman Republic becoming the dominant power in the Mediterranean.
>>> - **133-73 B.C.:** Series of civil wars and slave revolts, leading to the rise of Julius Caesar.
>>> - **49 B.C.:** Julius Caesar crosses the Rubicon River, starting the Roman Civil War.
>>> - **44 B.C.:** Julius Caesar is assassinated, leading to the Second Triumvirate of Octavian, Mark Antony, and Lepidus.
>>> - **31 B.C.:** Battle of Actium, where Octavian defeats Mark Antony and Cleopatra, becoming the sole ruler of Rome.
>>> - **27 B.C.:** The Roman Republic is transformed into the Roman Empire, with Octavian becoming the first Roman emperor, known as Augustus.
>>> - **1st century A.D.:** The Roman Empire reaches its greatest extent, stretching from Britain to Egypt.
>>> - **3rd century A.D.:** The Roman Empire begins to decline, facing internal instability, invasions by Germanic tribes, and the rise of Christianity.
>>> - **476 A.D.:** The last Western Roman emperor, Romulus Augustulus, is overthrown by the Germanic leader Odoacer, marking the end of the Roman Empire in the West.
```

## Additional References

🧑‍🍳 Cookbook: [Function Calling and Multimodal QA with Gemini](https://haystack.deepset.ai/cookbook/vertexai-gemini-examples)
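The Streaming section above says to pass a function to the `streaming_callback` init parameter, but doesn't show the shape of that function. Below is a minimal sketch of the callback contract; the `StreamingChunk` class here is a simplified stand-in for `haystack.dataclasses.StreamingChunk`, so the snippet runs without Vertex AI credentials:

```python
from dataclasses import dataclass


## simplified stand-in for haystack.dataclasses.StreamingChunk (illustration only)
@dataclass
class StreamingChunk:
    content: str


def print_streaming_chunk(chunk: StreamingChunk) -> None:
    ## print each piece of generated text as it arrives, without a trailing newline
    print(chunk.content, end="", flush=True)


## in real usage, you would pass the callback at init time:
## gemini_chat = VertexAIGeminiChatGenerator(streaming_callback=print_streaming_chunk)

## simulate a stream of three chunks
for token in ["The ", "Shawshank ", "Redemption"]:
    print_streaming_chunk(StreamingChunk(content=token))
```

The generator calls the callback once per chunk during generation, so the reply appears incrementally instead of all at once when `run` returns.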