---
title: "CohereGenerator"
id: coheregenerator
slug: "/coheregenerator"
description: "`CohereGenerator` enables text generation using Cohere's large language models (LLMs)."
---

# CohereGenerator

`CohereGenerator` enables text generation using Cohere's large language models (LLMs).

<div className="key-value-table">

|  |  |
| --- | --- |
| **Most common position in a pipeline** | After a [`PromptBuilder`](../builders/promptbuilder.mdx) |
| **Mandatory init variables** | `api_key`: The Cohere API key. Can be set with the `COHERE_API_KEY` or `CO_API_KEY` env var. |
| **Mandatory run variables** | `prompt`: A string containing the prompt for the LLM |
| **Output variables** | `replies`: A list of strings with all the replies generated by the LLM <br /> <br />`meta`: A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| **API reference** | [Cohere](/reference/integrations-cohere) |
| **GitHub link** | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cohere |

</div>

This integration supports Cohere models such as `command`, `command-r`, and `command-r-plus`. Check out the most recent full list in the [Cohere documentation](https://docs.cohere.com/reference/chat).

## Overview

`CohereGenerator` needs a Cohere API key to work. You can set this key in:

- The `api_key` init parameter using the [Secret API](../../concepts/secret-management.mdx)
- The `COHERE_API_KEY` environment variable (recommended)

The component also needs a prompt to operate, and you can pass any text generation parameters directly to this component using the `generation_kwargs` parameter at initialization. For more details on the parameters supported by the Cohere API, refer to the [Cohere documentation](https://docs.cohere.com/reference/chat).

### Streaming

This Generator supports [streaming](guides-to-generators/choosing-the-right-generator.mdx#streaming-support) the tokens from the LLM directly in the output. To do so, pass a callback function to the `streaming_callback` init parameter.

## Usage

You need to install the `cohere-haystack` package to use `CohereGenerator`:

```shell
pip install cohere-haystack
```

### On its own

Basic usage:

```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator()
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)

>>> {'replies': ["Natural Language Processing (NLP) is a subfield of artificial intelligence and computational linguistics that focuses on the interaction between computers and human languages..."],
 'meta': [{'finish_reason': 'COMPLETE'}]}
```

With streaming:

```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator(streaming_callback=lambda chunk: print(chunk.content, end="", flush=True))
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)

>>> Natural Language Processing (NLP) is the study of natural language and how it can be used to solve problems through computational methods, enabling machines to understand, interpret, and generate human language.

>>> {'replies': [' Natural Language Processing (NLP) is the study of natural language and how it can be used to solve problems through computational methods, enabling machines to understand, interpret, and generate human language.'], 'meta': [{'index': 0, 'finish_reason': 'COMPLETE'}]}
```

### In a pipeline

In a RAG pipeline:

```python
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.generators.cohere import CohereGenerator

docstore = InMemoryDocumentStore()
docstore.write_documents(
    [
        Document(content="Rome is the capital of Italy"),
        Document(content="Paris is the capital of France"),
    ],
)

query = "What is the capital of France?"

template = """
Given the following information, answer the question.

Context:
{% for document in documents %}
    {{ document.content }}
{% endfor %}

Question: {{ query }}
"""
pipe = Pipeline()

pipe.add_component("retriever", InMemoryBM25Retriever(document_store=docstore))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", CohereGenerator())
pipe.connect("retriever", "prompt_builder.documents")
pipe.connect("prompt_builder", "llm")

res = pipe.run({"prompt_builder": {"query": query}, "retriever": {"query": query}})

print(res)
```