# MCP UI with Vercel AI SDK

Start an MCP UI application that uses the [Vercel AI SDK] to provide a chat interface for local models,
served by [Docker Model Runner], with access to MCPs from the [Docker MCP Catalog].

The application starts with two models loaded (qwen3 and llama3.2), both of which support tool
calling. See the [compose.yaml](./compose.yaml) file for examples of how to add more models.
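As an illustrative sketch (the model and service names below are assumptions, not necessarily what this project's compose.yaml uses), Compose's top-level `models` element lets you declare an additional model and attach it to a service:

```yaml
# Hypothetical fragment: model and service names are illustrative.
models:
  gemma3:
    model: ai/gemma3        # model image pulled from Docker Hub's ai/ namespace

services:
  chat:
    models:
      - gemma3              # expose this model's endpoint to the chat service
```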

The application also starts with a connection to the Docker MCP Gateway, which has been configured to
provide access to two MCPs (Brave and Wikipedia). See the [compose.yaml](./compose.yaml) file for
examples of how to provide access to more MCPs.
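A minimal sketch of how a gateway service is typically wired up in Compose (the service name and flags here are assumptions, not copied from this project's compose.yaml):

```yaml
# Illustrative sketch only, not this project's actual configuration.
services:
  mcp-gateway:
    image: docker/mcp-gateway
    command:
      - --transport=sse
      - --servers=brave,wikipedia   # list more server names here to expose more MCPs
```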

# Getting Started

### Requirements

+ **[Docker Desktop] 4.43.0+ or [Docker Engine]** installed.
+ **A laptop or workstation with a GPU** (e.g., a MacBook) for running open models locally. If you
  don't have a GPU, you can alternatively use **[Docker Offload]**.
+ If you're using [Docker Engine] on Linux or [Docker Desktop] on Windows, ensure that the
  [Docker Model Runner requirements] are met (specifically that GPU support is enabled) and the
  necessary drivers are installed.
+ If you're using Docker Engine on Linux, ensure you have [Docker Compose] 2.38.1 or later installed.
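On Linux you can check the Compose version against the 2.38.1 minimum with a small version comparison (a sketch; it assumes `docker compose version --short` prints a bare version number such as `2.39.1`):

```sh
# Compare the installed Compose version against the required minimum.
# `sort -V` sorts version strings numerically; if the required version
# sorts first (or ties), the installed one is new enough.
required="2.38.1"
installed="$(docker compose version --short 2>/dev/null || echo 0)"
if [ "$(printf '%s\n' "$required" "$installed" | sort -V | head -n1)" = "$required" ]; then
  echo "Docker Compose $installed is new enough"
else
  echo "Docker Compose $installed is too old; need $required or later"
fi
```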

### Configure MCP secrets

This demo uses the Brave MCP, which requires an API key. You can create a free API key in the
[Brave Search API console](https://api-dashboard.search.brave.com/login).

```sh
docker mcp secret set 'brave.api_key=<insert your Brave Search API key here>'
```

### Clone the project repository

```sh
git clone git@github.com:slimslenderslacks/scira-mcp-chat.git
cd scira-mcp-chat
# create a blank .mcp.env for now (this step will be removed once Cloud has secret support)
touch .mcp.env
```

### Run the project locally

```sh
docker compose up --build
```

Access the MCP UI at [http://localhost:3000](http://localhost:3000).

# What can it do?

Choose one of the two local models loaded by `compose.yaml` and ask it to do something with either
Brave Search or the Wikipedia tools. For example:

> do a wikipedia search for articles about Docker and MCP

### Run the project in Docker Cloud

```sh
# only required temporarily, until Cloud has secret support
docker mcp secret export brave > .mcp.env

# compose.cloud.yaml still has one small diff from the local compose.yaml
docker compose -f compose.cloud.yaml up --build
```

# Project Structure

| File/Folder                               | Purpose                                                           |
| ----------------------------------------- | ----------------------------------------------------------------- |
| `compose.yaml`                            | Defines available models and MCPs                                 |
| `Dockerfile`                              | Builds the MCP UI application                                     |
| `Dockerfile.initialize-chat-store-schema` | Builds a container that initializes a Postgres schema for the app |

# Cleanup

```sh
docker compose down
```

# Credits

+ [Vercel AI SDK]
+ [Docker MCP Toolkit]
+ [Docker MCP Catalog]

[Vercel AI SDK]: https://ai-sdk.dev/docs/introduction
[Docker MCP Toolkit]: https://docs.docker.com/ai/mcp-catalog-and-toolkit/toolkit/
[Docker MCP Catalog]: https://hub.docker.com/mcp
[Docker Compose]: https://github.com/docker/compose
[Docker Desktop]: https://www.docker.com/products/docker-desktop/
[Docker Engine]: https://docs.docker.com/engine/
[Docker Model Runner]: https://docs.docker.com/ai/model-runner/
[Docker Model Runner requirements]: https://docs.docker.com/ai/model-runner/
[Docker Offload]: https://www.docker.com/products/docker-offload/