---
title: "AnthropicGenerator"
id: anthropicgenerator
slug: "/anthropicgenerator"
description: "This component enables text completions using Anthropic large language models (LLMs)."
---

# AnthropicGenerator

This component enables text completions using Anthropic large language models (LLMs).

<div className="key-value-table">

| | |
| --- | --- |
| **Most common position in a pipeline** | After a [PromptBuilder](../builders/promptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: An Anthropic API key. Can be set with the `ANTHROPIC_API_KEY` env var. |
| **Mandatory run variables** | `prompt`: A string containing the prompt for the LLM |
| **Output variables** | `replies`: A list of strings with all the replies generated by the LLM <br /> <br />`meta`: A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Anthropic](/reference/integrations-anthropic) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/anthropic |

</div>

## Overview

This integration supports Anthropic models such as `claude-3-5-sonnet-20240620`, `claude-3-opus-20240229`, `claude-3-haiku-20240307`, and similar. Although these LLMs are chat models, the main prompt interface works with string prompts. Check out the most recent full list in the [Anthropic documentation](https://docs.anthropic.com/en/docs/about-claude/models).

### Parameters

`AnthropicGenerator` needs an Anthropic API key to work. You can provide this key in:

- The `ANTHROPIC_API_KEY` environment variable (recommended)
- The `api_key` init parameter, using the Haystack [Secret](../../concepts/secret-management.mdx) API: `Secret.from_token("your-api-key-here")`

Set your preferred Anthropic model in the `model` parameter when initializing the component.

`AnthropicGenerator` requires a prompt to generate text. You can pass any text generation parameters available in the Anthropic [Messages API](https://docs.anthropic.com/en/api/messages) directly to this component using the `generation_kwargs` parameter, both at initialization and in the `run()` method. For more details on the parameters supported by the Anthropic API, see the [Anthropic documentation](https://docs.anthropic.com).

Finally, the component's `run()` method requires a single string prompt to generate text.

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in output. To do so, pass a function to the `streaming_callback` init parameter.

## Usage

Install the `anthropic-haystack` package to use the `AnthropicGenerator`:

```shell
pip install anthropic-haystack
```

### On its own

```python
from haystack_integrations.components.generators.anthropic import AnthropicGenerator

generator = AnthropicGenerator()
print(generator.run("What's Natural Language Processing? Be brief."))
```

### In a pipeline

You can also use `AnthropicGenerator` with the Anthropic models in your pipeline.

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack_integrations.components.generators.anthropic import AnthropicGenerator
from haystack.utils import Secret

template = """
You are an assistant giving out valuable information to language learners.
Answer this question, be brief.

Question: {{ query }}?
"""

pipe = Pipeline()
pipe.add_component("prompt_builder", PromptBuilder(template))
pipe.add_component("llm", AnthropicGenerator(api_key=Secret.from_env_var("ANTHROPIC_API_KEY")))
pipe.connect("prompt_builder", "llm")

query = "What language is spoken in Germany?"
res = pipe.run(data={"prompt_builder": {"query": query}})
print(res)
```