---
title: "Google AI"
id: integrations-google-ai
description: "Google AI integration for Haystack"
slug: "/integrations-google-ai"
---

<a id="haystack_integrations.components.generators.google_ai.gemini"></a>

## Module haystack\_integrations.components.generators.google\_ai.gemini

<a id="haystack_integrations.components.generators.google_ai.gemini.GoogleAIGeminiGenerator"></a>

### GoogleAIGeminiGenerator

Generates text using multimodal Gemini models through Google AI Studio.

### Usage example

```python
from haystack.utils import Secret
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiGenerator

gemini = GoogleAIGeminiGenerator(model="gemini-2.0-flash", api_key=Secret.from_token("<MY_API_KEY>"))
res = gemini.run(parts=["What is the most interesting thing you know?"])
for answer in res["replies"]:
    print(answer)
```

#### Multimodal example

```python
import requests
from haystack.utils import Secret
from haystack.dataclasses.byte_stream import ByteStream
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiGenerator

BASE_URL = (
    "https://raw.githubusercontent.com/deepset-ai/haystack-core-integrations"
    "/main/integrations/google_ai/example_assets"
)

URLS = [
    f"{BASE_URL}/robot1.jpg",
    f"{BASE_URL}/robot2.jpg",
    f"{BASE_URL}/robot3.jpg",
    f"{BASE_URL}/robot4.jpg",
]
images = [
    ByteStream(data=requests.get(url).content, mime_type="image/jpeg")
    for url in URLS
]

gemini = GoogleAIGeminiGenerator(model="gemini-2.0-flash", api_key=Secret.from_token("<MY_API_KEY>"))
result = gemini.run(parts=["What can you tell me about these robots?", *images])
for answer in result["replies"]:
    print(answer)
```

<a id="haystack_integrations.components.generators.google_ai.gemini.GoogleAIGeminiGenerator.__init__"></a>

#### GoogleAIGeminiGenerator.\_\_init\_\_

```python
def __init__(*,
             api_key: Secret = Secret.from_env_var("GOOGLE_API_KEY"),
             model: str = "gemini-2.0-flash",
             generation_config: Optional[Union[GenerationConfig,
                                               dict[str, Any]]] = None,
             safety_settings: Optional[dict[HarmCategory,
                                            HarmBlockThreshold]] = None,
             streaming_callback: Optional[Callable[[StreamingChunk],
                                                   None]] = None)
```

Initializes a `GoogleAIGeminiGenerator` instance.

To get an API key, visit: https://aistudio.google.com/

**Arguments**:

- `api_key`: Google AI Studio API key.
- `model`: Name of the model to use. For available models, see https://ai.google.dev/gemini-api/docs/models/gemini.
- `generation_config`: The generation configuration to use.
This can either be a `GenerationConfig` object or a dictionary of parameters.
For available parameters, see
[the `GenerationConfig` API reference](https://ai.google.dev/api/python/google/generativeai/GenerationConfig).
- `safety_settings`: The safety settings to use.
A dictionary with `HarmCategory` as keys and `HarmBlockThreshold` as values.
For more information, see [the API reference](https://ai.google.dev/api).
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
The callback function accepts a `StreamingChunk` as its argument.

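When `generation_config` is given as a dictionary rather than a `GenerationConfig` object, it maps parameter names to values. A minimal sketch, using a few of the standard `GenerationConfig` fields (check the linked API reference for the full, current list):

```python
# Sketch of a generation_config dictionary. The keys below
# (temperature, top_p, max_output_tokens, stop_sequences) are standard
# GenerationConfig parameters; any of them may be omitted.
generation_config = {
    "temperature": 0.7,        # sampling temperature; lower = more deterministic
    "top_p": 0.95,             # nucleus sampling cutoff
    "max_output_tokens": 512,  # hard cap on the reply length
    "stop_sequences": ["\n\n"],
}

# It would then be passed at construction time, e.g.:
# gemini = GoogleAIGeminiGenerator(generation_config=generation_config)
```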
<a id="haystack_integrations.components.generators.google_ai.gemini.GoogleAIGeminiGenerator.to_dict"></a>

#### GoogleAIGeminiGenerator.to\_dict

```python
def to_dict() -> dict[str, Any]
```

Serializes the component to a dictionary.

**Returns**:

Dictionary with serialized data.

<a id="haystack_integrations.components.generators.google_ai.gemini.GoogleAIGeminiGenerator.from_dict"></a>

#### GoogleAIGeminiGenerator.from\_dict

```python
@classmethod
def from_dict(cls, data: dict[str, Any]) -> "GoogleAIGeminiGenerator"
```

Deserializes the component from a dictionary.

**Arguments**:

- `data`: Dictionary to deserialize from.

**Returns**:

Deserialized component.

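Haystack components generally serialize to a dictionary with a `type` import path and an `init_parameters` mapping, and secrets are stored as environment-variable references rather than raw tokens. A sketch of the shape that `to_dict`/`from_dict` round-trip; the exact contents of `init_parameters` depend on how the component was configured, so treat this as illustrative only:

```python
# Illustrative shape of the serialized component (an assumption based on
# Haystack's usual default_to_dict format, not verbatim output).
serialized = {
    "type": (
        "haystack_integrations.components.generators.google_ai.gemini"
        ".GoogleAIGeminiGenerator"
    ),
    "init_parameters": {
        "model": "gemini-2.0-flash",
        # api_key serializes as an env-var reference, never the raw token
        "api_key": {"type": "env_var", "env_vars": ["GOOGLE_API_KEY"], "strict": True},
    },
}

# from_dict would rebuild the component from such a structure:
# gemini = GoogleAIGeminiGenerator.from_dict(serialized)
```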
<a id="haystack_integrations.components.generators.google_ai.gemini.GoogleAIGeminiGenerator.run"></a>

#### GoogleAIGeminiGenerator.run

```python
@component.output_types(replies=list[str])
def run(parts: Variadic[Union[str, ByteStream, Part]],
        streaming_callback: Optional[Callable[[StreamingChunk], None]] = None)
```

Generates text based on the given input parts.

**Arguments**:

- `parts`: A heterogeneous list of strings, `ByteStream`, or `Part` objects.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.

**Returns**:

A dictionary containing the following key:
- `replies`: A list of strings containing the generated responses.

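A streaming callback is just a callable invoked once per `StreamingChunk`. A minimal sketch, assuming each chunk exposes its text via a `content` attribute:

```python
# Minimal streaming callback sketch. It assumes each StreamingChunk
# carries its text in a `content` attribute.
def print_chunk(chunk) -> None:
    print(chunk.content, end="", flush=True)

# It would be wired up at construction time or per call, e.g.:
# gemini = GoogleAIGeminiGenerator(streaming_callback=print_chunk)
# gemini.run(parts=["..."], streaming_callback=print_chunk)
```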
<a id="haystack_integrations.components.generators.google_ai.chat.gemini"></a>

## Module haystack\_integrations.components.generators.google\_ai.chat.gemini

<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator"></a>

### GoogleAIGeminiChatGenerator

Completes chats using Gemini models through Google AI Studio.

It uses the [`ChatMessage`](https://docs.haystack.deepset.ai/docs/data-classes#chatmessage)
dataclass to interact with the model.

### Usage example

```python
from haystack.utils import Secret
from haystack.dataclasses.chat_message import ChatMessage
from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator


gemini_chat = GoogleAIGeminiChatGenerator(model="gemini-2.0-flash", api_key=Secret.from_token("<MY_API_KEY>"))

messages = [ChatMessage.from_user("What is the most interesting thing you know?")]
res = gemini_chat.run(messages=messages)
for reply in res["replies"]:
    print(reply.text)

messages += res["replies"] + [ChatMessage.from_user("Tell me more about it")]
res = gemini_chat.run(messages=messages)
for reply in res["replies"]:
    print(reply.text)
```

#### With function calling

```python
from typing import Annotated
from haystack.utils import Secret
from haystack.dataclasses.chat_message import ChatMessage
from haystack.components.tools import ToolInvoker
from haystack.tools import create_tool_from_function

from haystack_integrations.components.generators.google_ai import GoogleAIGeminiChatGenerator

# example function to get the current weather
def get_current_weather(
    location: Annotated[str, "The city for which to get the weather, e.g. 'San Francisco'"] = "Munich",
    unit: Annotated[str, "The unit for the temperature, e.g. 'celsius'"] = "celsius",
) -> str:
    return f"The weather in {location} is sunny. The temperature is 20 {unit}."

tool = create_tool_from_function(get_current_weather)
tool_invoker = ToolInvoker(tools=[tool])

gemini_chat = GoogleAIGeminiChatGenerator(
    model="gemini-2.0-flash-exp",
    api_key=Secret.from_token("<MY_API_KEY>"),
    tools=[tool],
)
user_message = [ChatMessage.from_user("What is the temperature in celsius in Berlin?")]
replies = gemini_chat.run(messages=user_message)["replies"]
print(replies[0].tool_calls)

# actually invoke the tool
tool_messages = tool_invoker.run(messages=replies)["tool_messages"]
messages = user_message + replies + tool_messages

# transform the tool call result into a human-readable message
final_replies = gemini_chat.run(messages=messages)["replies"]
print(final_replies[0].text)
```

<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator.__init__"></a>

#### GoogleAIGeminiChatGenerator.\_\_init\_\_

```python
def __init__(*,
             api_key: Secret = Secret.from_env_var("GOOGLE_API_KEY"),
             model: str = "gemini-2.0-flash",
             generation_config: Optional[Union[GenerationConfig,
                                               dict[str, Any]]] = None,
             safety_settings: Optional[dict[HarmCategory,
                                            HarmBlockThreshold]] = None,
             tools: Optional[list[Tool]] = None,
             tool_config: Optional[content_types.ToolConfigDict] = None,
             streaming_callback: Optional[StreamingCallbackT] = None)
```

Initializes a `GoogleAIGeminiChatGenerator` instance.

To get an API key, visit: https://aistudio.google.com/

**Arguments**:

- `api_key`: Google AI Studio API key. To get a key,
see [Google AI Studio](https://aistudio.google.com/).
- `model`: Name of the model to use. For available models, see https://ai.google.dev/gemini-api/docs/models/gemini.
- `generation_config`: The generation configuration to use.
This can either be a `GenerationConfig` object or a dictionary of parameters.
For available parameters, see
[the API reference](https://ai.google.dev/api/generate-content).
- `safety_settings`: The safety settings to use.
A dictionary with `HarmCategory` as keys and `HarmBlockThreshold` as values.
For more information, see [the API reference](https://ai.google.dev/api/generate-content).
- `tools`: A list of tools for which the model can prepare calls.
- `tool_config`: The tool config to use. See the documentation for
[ToolConfig](https://ai.google.dev/api/caching#ToolConfig).
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
The callback function accepts a `StreamingChunk` as its argument.

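`tool_config` controls how the model chooses between replying in text and emitting a tool call. A sketch of the dictionary shape described in the linked `ToolConfig` reference (`"AUTO"`, `"ANY"`, and `"NONE"` are the documented modes; verify the exact fields against that page):

```python
# Sketch of a tool_config dictionary following the ToolConfig shape from
# the Google AI reference linked above.
tool_config = {
    "function_calling_config": {
        # "AUTO": model decides; "ANY": force a tool call; "NONE": never call tools
        "mode": "ANY",
        # with "ANY", optionally restrict which functions may be called
        "allowed_function_names": ["get_current_weather"],
    }
}
```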
<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator.to_dict"></a>

#### GoogleAIGeminiChatGenerator.to\_dict

```python
def to_dict() -> dict[str, Any]
```

Serializes the component to a dictionary.

**Returns**:

Dictionary with serialized data.

<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator.from_dict"></a>

#### GoogleAIGeminiChatGenerator.from\_dict

```python
@classmethod
def from_dict(cls, data: dict[str, Any]) -> "GoogleAIGeminiChatGenerator"
```

Deserializes the component from a dictionary.

**Arguments**:

- `data`: Dictionary to deserialize from.

**Returns**:

Deserialized component.

<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator.run"></a>

#### GoogleAIGeminiChatGenerator.run

```python
@component.output_types(replies=list[ChatMessage])
def run(messages: list[ChatMessage],
        streaming_callback: Optional[StreamingCallbackT] = None,
        *,
        tools: Optional[list[Tool]] = None)
```

Generates text based on the provided messages.

**Arguments**:

- `messages`: A list of `ChatMessage` instances, representing the input messages.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
- `tools`: A list of tools for which the model can prepare calls. If set, it overrides the `tools` parameter set
during component initialization.

**Returns**:

A dictionary containing the following key:
- `replies`: A list containing the generated responses as `ChatMessage` instances.

<a id="haystack_integrations.components.generators.google_ai.chat.gemini.GoogleAIGeminiChatGenerator.run_async"></a>

#### GoogleAIGeminiChatGenerator.run\_async

```python
@component.output_types(replies=list[ChatMessage])
async def run_async(messages: list[ChatMessage],
                    streaming_callback: Optional[StreamingCallbackT] = None,
                    *,
                    tools: Optional[list[Tool]] = None)
```

Async version of the `run` method. Generates text based on the provided messages.

**Arguments**:

- `messages`: A list of `ChatMessage` instances, representing the input messages.
- `streaming_callback`: A callback function that is called when a new token is received from the stream.
- `tools`: A list of tools for which the model can prepare calls. If set, it overrides the `tools` parameter set
during component initialization.

**Returns**:

A dictionary containing the following key:
- `replies`: A list containing the generated responses as `ChatMessage` instances.
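
Because `run_async` is a coroutine, several independent requests can be awaited concurrently on one event loop. A sketch of that pattern; the `ask_concurrently` helper is illustrative rather than part of the API, `chat` stands for a configured `GoogleAIGeminiChatGenerator`, and each item of `message_lists` is a list of `ChatMessage` objects:

```python
import asyncio

# Illustrative helper: fan several chat requests out concurrently.
# `chat` is assumed to be a configured GoogleAIGeminiChatGenerator.
async def ask_concurrently(chat, message_lists):
    tasks = [chat.run_async(messages=messages) for messages in message_lists]
    results = await asyncio.gather(*tasks)  # preserves input order
    # each result is the usual {"replies": [...]} dictionary
    return [result["replies"][0].text for result in results]
```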
346