---
title: "WatsonxGenerator"
id: watsonxgenerator
slug: "/watsonxgenerator"
description: "Use this component with IBM watsonx models like `granite-3-2b-instruct` for simple text generation tasks."
---

# WatsonxGenerator

Use this component with IBM watsonx models like `granite-3-2b-instruct` for simple text generation tasks.

<div className="key-value-table">

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [PromptBuilder](../builders/promptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: An IBM Cloud API key. Can be set with the `WATSONX_API_KEY` env var. <br /> <br />`project_id`: An IBM Cloud project ID. Can be set with the `WATSONX_PROJECT_ID` env var. |
| **Mandatory run variables** | `prompt`: A string containing the prompt for the LLM |
| **Output variables** | `replies`: A list of strings with all the replies generated by the LLM <br /> <br />`meta`: A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Watsonx](/reference/integrations-watsonx) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/watsonx |

</div>

## Overview

This integration supports IBM watsonx.ai foundation models such as `ibm/granite-13b-chat-v2`, `ibm/llama-2-70b-chat`, `ibm/llama-3-70b-instruct`, and similar. These models provide high-quality text generation capabilities through IBM's cloud platform. Check out the most recent full list in the [IBM watsonx.ai documentation](https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-models-ibm.html?context=wx).

### Parameters

`WatsonxGenerator` needs IBM Cloud credentials to work.
You can provide these in:

- The `WATSONX_API_KEY` environment variable (recommended)
- The `WATSONX_PROJECT_ID` environment variable (recommended)
- The `api_key` and `project_id` init parameters using the Haystack [Secret](../../concepts/secret-management.mdx) API: `Secret.from_token("your-api-key-here")`

Set your preferred IBM watsonx.ai model in the `model` parameter when initializing the component. The default model is `ibm/granite-3-2b-instruct`.

In addition to the prompt, you can pass any text generation parameters available in the IBM watsonx.ai API to this component through the `generation_kwargs` parameter, both at initialization and in the `run()` method. For more details on the parameters supported by the IBM watsonx.ai API, see the [IBM watsonx.ai documentation](https://cloud.ibm.com/apidocs/watsonx-ai).

The component also supports system prompts, set at initialization or passed at runtime, to provide context or instructions for the generation.

Finally, the component's `run()` method requires a single string prompt to generate text.

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in output. To do so, pass a function to the `streaming_callback` init parameter.

## Usage

Install the `watsonx-haystack` package to use the `WatsonxGenerator`:

```shell
pip install watsonx-haystack
```

### On its own

```python
from haystack_integrations.components.generators.watsonx.generator import (
    WatsonxGenerator,
)
from haystack.utils import Secret

generator = WatsonxGenerator(
    api_key=Secret.from_env_var("WATSONX_API_KEY"),
    project_id=Secret.from_env_var("WATSONX_PROJECT_ID"),
)

print(generator.run("What's Natural Language Processing? Be brief."))
```

### In a pipeline

You can also use `WatsonxGenerator` with the IBM watsonx.ai models in your pipeline.

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack_integrations.components.generators.watsonx.generator import (
    WatsonxGenerator,
)
from haystack.utils import Secret

template = """
You are an assistant giving out valuable information to language learners.
Answer this question, be brief.

Question: {{ query }}?
"""

pipe = Pipeline()
pipe.add_component("prompt_builder", PromptBuilder(template))
pipe.add_component(
    "llm",
    WatsonxGenerator(
        api_key=Secret.from_env_var("WATSONX_API_KEY"),
        project_id=Secret.from_env_var("WATSONX_PROJECT_ID"),
    ),
)
pipe.connect("prompt_builder", "llm")

query = "What language is spoken in Germany?"
res = pipe.run(data={"prompt_builder": {"query": query}})

print(res)
```