---
title: "Hayhooks"
id: hayhooks
slug: "/hayhooks"
description: "Hayhooks is a web application you can use to serve Haystack pipelines through HTTP endpoints. This page provides an overview of the main features of Hayhooks."
---

# Hayhooks

Hayhooks is a web application you can use to serve Haystack pipelines through HTTP endpoints. This page provides an overview of the main features of Hayhooks.

:::info[Hayhooks GitHub]

You can find the code and an in-depth explanation of the features in the [Hayhooks GitHub repository](https://github.com/deepset-ai/hayhooks).
:::

## Overview

Hayhooks simplifies the deployment of Haystack pipelines as REST APIs. It allows you to:

- Expose Haystack pipelines as HTTP endpoints, including OpenAI-compatible chat endpoints,
- Customize logic while keeping minimal boilerplate,
- Deploy pipelines quickly and efficiently.

### Installation

Install Hayhooks using pip:

```shell
pip install hayhooks
```

The `hayhooks` package ships with both the server and the client, and the client can also start the server. From a shell, start the server with:

```shell
$ hayhooks run
INFO:     Started server process [44782]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://localhost:1416 (Press CTRL+C to quit)
```

### Check Status

From a different shell, you can query the status of the server with:

```shell
$ hayhooks status
Hayhooks server is up and running.
```

## Configuration

Hayhooks can be configured in three ways:

1. Using an `.env` file in the project root.
2. Passing environment variables when running the command.
3. Using command-line arguments with `hayhooks run`.

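As a sketch, the same setting (here, the server port) can be supplied in any of the three ways. The flag name below is illustrative; check `hayhooks run --help` for the exact options your version supports:

```shell
# 1. .env file in the project root
echo "HAYHOOKS_PORT=8080" > .env
hayhooks run

# 2. Environment variable on the command line
HAYHOOKS_PORT=8080 hayhooks run

# 3. Command-line argument
hayhooks run --port 8080
```
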
### Environment Variables

<div className="key-value-table">

| Variable                          | Description                        |
| --------------------------------- | ---------------------------------- |
| `HAYHOOKS_HOST`                   | Host address for the server        |
| `HAYHOOKS_PORT`                   | Port for the server                |
| `HAYHOOKS_PIPELINES_DIR`          | Directory containing pipelines     |
| `HAYHOOKS_ROOT_PATH`              | Root path of the server            |
| `HAYHOOKS_ADDITIONAL_PYTHON_PATH` | Additional Python paths to include |
| `HAYHOOKS_DISABLE_SSL`            | Disable SSL verification (boolean) |
| `HAYHOOKS_SHOW_TRACEBACKS`        | Show error tracebacks (boolean)    |

</div>

### CORS Settings

<div className="key-value-table">

| Variable                           | Description                                         |
| ---------------------------------- | --------------------------------------------------- |
| `HAYHOOKS_CORS_ALLOW_ORIGINS`      | List of allowed origins (default: `[*]`)            |
| `HAYHOOKS_CORS_ALLOW_METHODS`      | List of allowed HTTP methods (default: `[*]`)       |
| `HAYHOOKS_CORS_ALLOW_HEADERS`      | List of allowed headers (default: `[*]`)            |
| `HAYHOOKS_CORS_ALLOW_CREDENTIALS`  | Allow credentials (default: `false`)                |
| `HAYHOOKS_CORS_ALLOW_ORIGIN_REGEX` | Regex pattern for allowed origins (default: `null`) |
| `HAYHOOKS_CORS_EXPOSE_HEADERS`     | Headers to expose in response (default: `[]`)       |
| `HAYHOOKS_CORS_MAX_AGE`            | Max age for preflight responses (default: `600`)    |

</div>

## Running Hayhooks

To start the server:

```shell
hayhooks run
```

This will launch Hayhooks at `HAYHOOKS_HOST:HAYHOOKS_PORT`.

## Deploying a Pipeline

### Steps

1. Prepare a pipeline definition (`.yml` file) and a `pipeline_wrapper.py` file.
2. Deploy the pipeline:

   ```shell
   hayhooks pipeline deploy-files -n my_pipeline my_pipeline_dir
   ```
3. Access the pipeline at the `/{pipeline_name}/run` endpoint.

### Pipeline Wrapper

A `PipelineWrapper` class is required to wrap the pipeline:

```python
from pathlib import Path
from haystack import Pipeline
from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        pipeline_yaml = (Path(__file__).parent / "pipeline.yml").read_text()
        self.pipeline = Pipeline.loads(pipeline_yaml)

    def run_api(self, input_text: str) -> str:
        result = self.pipeline.run({"input": {"text": input_text}})
        return result["output"]["text"]
```

## File Uploads

Hayhooks enables handling file uploads in your pipeline wrapper's `run_api` method by including `files: Optional[List[UploadFile]] = None` as an argument.

```python
from typing import List, Optional

from fastapi import UploadFile


def run_api(self, files: Optional[List[UploadFile]] = None) -> str:
    if files and len(files) > 0:
        filenames = [f.filename for f in files if f.filename is not None]
        file_contents = [f.file.read() for f in files]  # raw bytes of each upload
        return f"Received files: {', '.join(filenames)}"
    return "No files received"
```

Hayhooks automatically processes uploaded files and passes them to the `run_api` method when present. The HTTP request must be a `multipart/form-data` request.

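To exercise the handler's branching in isolation, here is a self-contained sketch that uses a stand-in for FastAPI's `UploadFile` (the `FakeUpload` class is hypothetical, for illustration only):

```python
import io
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class FakeUpload:
    """Minimal stand-in for fastapi.UploadFile: a filename plus a file-like object."""
    filename: Optional[str]
    file: io.BytesIO


def run_api(files: Optional[List[FakeUpload]] = None) -> str:
    # Same branching as the wrapper method above: report names if files arrived
    if files and len(files) > 0:
        filenames = [f.filename for f in files if f.filename is not None]
        return f"Received files: {', '.join(filenames)}"
    return "No files received"


uploads = [FakeUpload("a.pdf", io.BytesIO(b"%PDF")), FakeUpload("notes.txt", io.BytesIO(b"hi"))]
print(run_api(uploads))  # Received files: a.pdf, notes.txt
print(run_api())         # No files received
```
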
### Combining Files and Parameters

Hayhooks also supports handling both files and additional parameters in the same request by including them as arguments in `run_api`:

```python
def run_api(
    self,
    files: Optional[List[UploadFile]] = None,
    additional_param: str = "default",
) -> str: ...
```

## Running Pipelines from the CLI

### With JSON-Compatible Parameters

You can execute a pipeline through the command line using the `hayhooks pipeline run` command. Internally, this triggers the `run_api` method of the pipeline wrapper, passing parameters as a JSON payload.

This method is ideal for testing deployed pipelines from the CLI without writing additional code.

```shell
hayhooks pipeline run <pipeline_name> --param 'question="Is this recipe vegan?"'
```

### With File Uploads

To execute a pipeline that requires a file input, use a `multipart/form-data` request. You can submit both files and parameters in the same request.

Ensure the deployed pipeline supports file handling.

```shell
# Upload a directory
hayhooks pipeline run <pipeline_name> --dir files_to_index

# Upload a single file
hayhooks pipeline run <pipeline_name> --file file.pdf

# Upload multiple files
hayhooks pipeline run <pipeline_name> --dir files_to_index --file file1.pdf --file file2.pdf

# Upload a file with an additional parameter
hayhooks pipeline run <pipeline_name> --file file.pdf --param 'question="Is this recipe vegan?"'
```

## MCP Support

### MCP Server

Hayhooks supports the Model Context Protocol (MCP) and can act as an MCP Server. It automatically lists your deployed pipelines as MCP Tools using Server-Sent Events (SSE) as the transport method.

To start the Hayhooks MCP server, run:

```shell
hayhooks mcp run
```

This starts the server at `HAYHOOKS_MCP_HOST:HAYHOOKS_MCP_PORT`.

### Creating a PipelineWrapper

To expose a Haystack pipeline as an MCP Tool, you need a `PipelineWrapper` for it. Each MCP Tool is defined by the following properties:

- **name**: The tool's name
- **description**: The tool's description
- **inputSchema**: A JSON Schema object describing the tool's input parameters

For each deployed pipeline, Hayhooks will:

1. Use the pipeline wrapper name as the MCP Tool name,
2. Use the `run_api` method's docstring as the MCP Tool description (if present),
3. Generate a Pydantic model from the `run_api` method arguments to serve as the tool's input schema.

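The mapping in step 3 can be sketched with the standard library alone: inspect the `run_api` signature and collect each argument's type hint. This simplified sketch stops at the name-to-type mapping; the real Hayhooks implementation goes on to build a Pydantic model from it:

```python
import inspect
from typing import List, get_type_hints


def run_api(urls: List[str], question: str) -> str:
    """Ask a question about one or more websites."""
    return ""


# Collect {argument name: type hint}, skipping the return annotation
hints = get_type_hints(run_api)
params = {
    name: hints[name]
    for name in inspect.signature(run_api).parameters
    if name in hints
}
print(params)  # {'urls': typing.List[str], 'question': <class 'str'>}
```
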
#### PipelineWrapper Example

```python
from pathlib import Path
from typing import List
from haystack import Pipeline
from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        pipeline_yaml = (Path(__file__).parent / "chat_with_website.yml").read_text()
        self.pipeline = Pipeline.loads(pipeline_yaml)

    def run_api(self, urls: List[str], question: str) -> str:
        """
        Ask a question about one or more websites using a Haystack pipeline.
        """
        result = self.pipeline.run(
            {"fetcher": {"urls": urls}, "prompt": {"query": question}},
        )
        return result["llm"]["replies"][0]
```

### Skipping MCP Tool Listing

To deploy a pipeline without listing it as an MCP Tool, set `skip_mcp = True` in your class:

```python
class PipelineWrapper(BasePipelineWrapper):
    # This will skip the MCP Tool listing
    skip_mcp = True

    def setup(self) -> None: ...

    def run_api(self, urls: List[str], question: str) -> str: ...
```

## OpenAI Compatibility

Hayhooks supports OpenAI-compatible endpoints through the `run_chat_completion` method.

```python
from hayhooks import BasePipelineWrapper, get_last_user_message


class PipelineWrapper(BasePipelineWrapper):
    def run_chat_completion(self, model: str, messages: list, body: dict):
        question = get_last_user_message(messages)
        return self.pipeline.run({"query": question})
```

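A helper like `get_last_user_message` scans the OpenAI-style message list for the most recent `user` message. A hypothetical re-implementation of that behavior, for illustration (the real helper may differ in details):

```python
from typing import Optional


def last_user_message(messages: list) -> Optional[str]:
    # Scan from newest to oldest and return the first user message found
    for message in reversed(messages):
        if message.get("role") == "user":
            return message.get("content")
    return None


messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "First question"},
    {"role": "assistant", "content": "An answer"},
    {"role": "user", "content": "Is this recipe vegan?"},
]
print(last_user_message(messages))  # Is this recipe vegan?
```
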
### Streaming Responses

Hayhooks provides a `streaming_generator` utility to stream pipeline output to the client:

```python
from hayhooks import BasePipelineWrapper, get_last_user_message, streaming_generator


class PipelineWrapper(BasePipelineWrapper):
    def run_chat_completion(self, model: str, messages: list, body: dict):
        question = get_last_user_message(messages)
        return streaming_generator(
            pipeline=self.pipeline,
            pipeline_run_args={"query": question},
        )
```

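Utilities like `streaming_generator` are typically built on a callback-to-generator bridge: the pipeline's streaming callback pushes chunks onto a queue while a generator drains them to the client. A simplified, self-contained sketch of that pattern (not Hayhooks' actual implementation):

```python
import queue
import threading
from typing import Callable, Iterator, Optional


def stream_from_callback(run: Callable[[Callable[[str], None]], None]) -> Iterator[str]:
    """Run `run` in a thread; its on_chunk callback feeds this generator."""
    q: "queue.Queue[Optional[str]]" = queue.Queue()

    def runner() -> None:
        run(q.put)   # each chunk produced by the pipeline lands on the queue
        q.put(None)  # sentinel: the pipeline run has finished

    threading.Thread(target=runner, daemon=True).start()
    while (chunk := q.get()) is not None:
        yield chunk


# Simulated pipeline run that emits three chunks through its callback
def fake_pipeline_run(on_chunk: Callable[[str], None]) -> None:
    for part in ["Hel", "lo, ", "world"]:
        on_chunk(part)


print("".join(stream_from_callback(fake_pipeline_run)))  # Hello, world
```
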
## Running Programmatically

Hayhooks can be embedded in a FastAPI application:

```python
import uvicorn
from fastapi import Request
from hayhooks import create_app
from hayhooks.settings import settings

# Create the Hayhooks app
hayhooks = create_app()


# Add a custom route
@hayhooks.get("/custom")
async def custom_route():
    return {"message": "Hi, this is a custom route!"}


# Add a custom middleware
@hayhooks.middleware("http")
async def custom_middleware(request: Request, call_next):
    response = await call_next(request)
    response.headers["X-Custom-Header"] = "custom-header-value"
    return response


if __name__ == "__main__":
    uvicorn.run("app:hayhooks", host=settings.host, port=settings.port)
```