![](./public/logo_light.png)
[![Version](https://img.shields.io/badge/Version-0.4.0-blue)](https://github.com/yourusername/chainlit-langgraph)
[![Chainlit](https://img.shields.io/badge/Chainlit-2.8.3-brightgreen)](https://github.com/Chainlit/chainlit)
[![LangGraph](https://img.shields.io/badge/LangGraph-1.0.1-brightgreen)](https://github.com/langchain-ai/langgraph)
[![License](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)

[!["Buy Me A Coffee"](https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png)](https://buymeacoffee.com/brucechou1x)

Rapidly build and deploy production-ready conversational AI agents using Chainlit and LangGraph. This powerful integration combines state-of-the-art language models with flexible workflow management, enabling developers to create sophisticated chatbots, virtual assistants, and interactive AI applications in minutes.

Looking for a way to run your open-source models? This is a free tool for you.

![Demo](./resource/screenshot.gif)

## Table of Contents
- [Table of Contents](#table-of-contents)
- [**Why This Project?**](#why-this-project)
- [**Features**](#features)
- [**Getting Started**](#getting-started)
  - [Setting up Ollama (Optional)](#setting-up-ollama-optional)
- [**Creating Custom Workflow**](#creating-custom-workflow)
- [**Workflows**](#workflows)
  - [Simple Chat Workflow](#simple-chat-workflow)
  - [Multimodal Chat Workflow](#multimodal-chat-workflow)
  - [Resume Optimizer](#resume-optimizer)
  - [Lean Canvas Chat](#lean-canvas-chat)
- [Upcoming Features](#upcoming-features)

## **Why This Project?**
[Chainlit](https://github.com/Chainlit/chainlit) is a powerful tool for building production-ready conversational AI applications. [LangGraph](https://github.com/langchain-ai/langgraph) is a versatile framework for building and managing state graphs in AI applications. This project combines the two into a comprehensive solution for building conversational AI agents in minutes.

## **Features**
- **Building Blocks**: Utilize a variety of building blocks to create your own conversational AI agents.
- **Multiple LLM Support**: Automatically detects and uses the following LLM providers:
  - **Ollama**: Run open-source models locally, no API key required.
  - **Claude**: Advanced AI models by Anthropic. [Get an API key here](https://console.anthropic.com/account/keys)
  - **GPT**: Advanced AI models by OpenAI. [Get an API key here](https://platform.openai.com/settings/organization/api-keys)
  - **Grok**: Grok models by xAI. [Get an API key here](https://docs.x.ai/docs/quickstart#creating-an-api-key)
  - **Groq**: Fast inference service by Groq. [Get an API key here](https://console.groq.com/keys)
  - **Gemini**: Google AI models. [Get an API key here](https://aistudio.google.com/app/apikey)
- **Examples**: Explore a variety of use cases through the example conversational AI agents.

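Conceptually, automatic provider detection can be as simple as checking which API keys are present in the environment. The sketch below illustrates the idea; the environment variable names for Grok, Groq, and Gemini (`XAI_API_KEY`, `GROQ_API_KEY`, `GOOGLE_API_KEY`) are assumptions here, not necessarily what this project uses.

```python
import os

# Assumed env var per provider; Ollama runs locally and needs no key.
PROVIDER_ENV_VARS = {
    "Claude": "ANTHROPIC_API_KEY",
    "GPT": "OPENAI_API_KEY",
    "Grok": "XAI_API_KEY",       # assumption
    "Groq": "GROQ_API_KEY",      # assumption
    "Gemini": "GOOGLE_API_KEY",  # assumption
}

def detect_providers(env=os.environ) -> list[str]:
    """Return the providers usable in this environment."""
    providers = ["Ollama"]  # always a candidate when an Ollama server is running
    providers += [name for name, var in PROVIDER_ENV_VARS.items() if env.get(var)]
    return providers
```
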
## **Getting Started**
Follow these steps to set up and run the project with Docker Compose, or in a Python 3.10 virtual environment.

1. Make sure you have Docker and Docker Compose installed on your system.
2. Clone this repository and navigate to the project directory.
3. Copy the `.env.example` file to `.env`:

```bash
cp .env.example .env
```

4. Edit the `.env` file and set the required variables:
   - **API keys** (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`): optional if you use **Ollama**.
   - **DB volume settings** (`POSTGRES_VOLUME_PATH`, `MINIO_VOLUME_PATH`): create the mount folders on your host machine and set the paths accordingly.
   - (Optional) `TAVILY_API_KEY` to enable web search.
   - (Optional) Google OAuth credentials.
   - (Optional) LangSmith tracing.

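For reference, a minimal `.env` might look like the sketch below. All values are placeholders, and `.env.example` remains the authoritative list of variables:

```bash
# LLM API keys (optional when using Ollama)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# Host folders mounted as DB volumes (create them first)
POSTGRES_VOLUME_PATH=./data/postgres
MINIO_VOLUME_PATH=./data/minio

# Optional: web search
TAVILY_API_KEY=tvly-...
```
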
5. Start the services using Docker Compose:

```bash
docker compose up
```

This starts all the necessary services, including the Chainlit application, the PostgreSQL database, and MinIO object storage.

6. The application should now be running at http://localhost:8000. Log in with the default credentials (`admin` / `admin`); you can change them in the `.env` file.

### Setting up Ollama (Optional)

1. Download and install [Ollama](https://ollama.com).
2. Pull whichever model you want to use, for example:

```bash
ollama pull cas/ministral-8b-instruct-2410_q4km:latest
ollama pull llama3.2:3b-instruct-q8_0
```

or run any GGUF-based model from [Hugging Face](https://huggingface.co/docs/hub/ollama) directly:

```bash
ollama run hf.co/{username}/{repository}:{quantization}
```

![](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/ollama/guide.png)

## **Creating Custom Workflow**
Creating your own custom workflow lets you tailor the application to your specific needs. Follow the step-by-step guide below.

1. Go to the `chat_workflow/workflows` directory in your project and create a new Python file for your workflow, e.g., `my_custom_workflow.py`.
2. Define your state class.
   - Inherit from `BaseState` to define the state variables your workflow will use. For example:
   ```python
   class MyCustomState(BaseState):
       # Model name of the chatbot
       chat_model: str
       # Add other state variables as needed
   ```
3. Define your workflow.
   - Inherit from `BaseWorkflow` and override the `create_graph` method to define the state graph:
   ```python
   class MyCustomWorkflow(BaseWorkflow):
       def create_graph(self) -> StateGraph:
           # LangGraph graph definition
           graph = StateGraph(MyCustomState)
           # Add nodes to the graph
           graph.add_node("chat", self.chat_node)
           # Add edges between nodes
           graph.add_edge("chat", END)
           # Set the entry point of the graph
           graph.set_entry_point("chat")
           return graph
   ```
   - Implement node methods such as `self.chat_node` on your workflow class.
   - Define the default state by overriding the `create_default_state` method:
   ```python
   def create_default_state(self) -> MyCustomState:
       return {
           "name": self.name(),
           "messages": [],
           "chat_model": "",
           # Initialize other state variables if needed
       }
   ```
   - Set the workflow properties:
     - `name`: the display name of the workflow, e.g., "My Custom Workflow".
     - `output_chat_model`: the LLM model used to produce the final response.
     - `chat_profile`: the chat profile for the workflow.
     - `starter`: the starter message for the workflow.

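To make the control flow above concrete, here is a self-contained sketch that mimics the `add_node` / `add_edge` / `set_entry_point` wiring in plain Python, with no LangGraph dependency. `MiniGraph` is a toy stand-in for `StateGraph` (the real API differs), and the chat node just fakes an LLM reply:

```python
from typing import Callable, TypedDict

END = "__end__"  # sentinel node name, mirroring langgraph's END

class MyCustomState(TypedDict):
    name: str
    messages: list[str]
    chat_model: str

class MiniGraph:
    """Toy stand-in for StateGraph: nodes transform state, edges pick the next node."""

    def __init__(self) -> None:
        self.nodes: dict[str, Callable[[MyCustomState], MyCustomState]] = {}
        self.edges: dict[str, str] = {}
        self.entry = ""

    def add_node(self, name: str, fn: Callable) -> None:
        self.nodes[name] = fn

    def add_edge(self, src: str, dst: str) -> None:
        self.edges[src] = dst

    def set_entry_point(self, name: str) -> None:
        self.entry = name

    def run(self, state: MyCustomState) -> MyCustomState:
        node = self.entry
        while node != END:
            state = self.nodes[node](state)  # run the current node
            node = self.edges[node]          # follow its outgoing edge
        return state

def chat_node(state: MyCustomState) -> MyCustomState:
    # A real chat node would call the configured LLM here.
    state["messages"].append(f"({state['chat_model']}) Hello!")
    return state

graph = MiniGraph()
graph.add_node("chat", chat_node)
graph.add_edge("chat", END)
graph.set_entry_point("chat")

final = graph.run({"name": "My Custom Workflow", "messages": [], "chat_model": "llama3.2"})
```

Running the graph executes `chat` once and then stops at `END`, which is exactly the shape of the simple workflow defined above.
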
## **Workflows**
This project includes several pre-built workflows that demonstrate the capabilities of the Chainlit-LangGraph integration:

### [Simple Chat Workflow](./chat_workflow/workflows/simple_chat.py)
Located in `simple_chat.py`, this workflow provides a basic chatbot experience:
- Utilizes a state graph with chat and tool nodes
- Supports multiple language models
- Includes basic tools like datetime and web search

### [Multimodal Chat Workflow](./chat_workflow/workflows/multimodal_chat.py)
Located in `multimodal_chat.py`, this workflow handles richer inputs:
- Supports image and text inputs

### [Resume Optimizer](./chat_workflow/workflows/resume_optimizer.py)
Found in `resume_optimizer.py`, this workflow helps users improve their resumes:
- Features a resume extractor node to process uploaded PDF resumes
- Provides detailed analysis and suggestions for resume improvement

### [Lean Canvas Chat](./chat_workflow/workflows/lean_canvas_chat.py)
Implemented in `lean_canvas_chat.py`, this workflow assists in business modeling:
- Guides users through the Lean Canvas creation process
- Offers a structured approach to defining business models

Each workflow demonstrates a different aspect of the Chainlit-LangGraph integration, showcasing its flexibility and power in creating AI-driven applications.

## Upcoming Features
- **Model Context Protocol**: An open [protocol](https://modelcontextprotocol.io), open-sourced by Anthropic, that enables seamless integration between LLM applications and external data sources and tools.
- **Research Assistant**: A research assistant that helps users with general research tasks, similar to NotebookLM.
- **NVIDIA NIM**: Self-hosted, GPU-accelerated inference microservices for pretrained and customized AI models across clouds, data centers, and workstations.
- **Cloud Deployment**: Easy deployment of the application to cloud platforms like AWS, Azure, or GCP.
- **Graph Builder**: A meta-workflow builder that lets users create custom workflows with natural language.
- **OpenAI o1-like agentic workflow**: Advanced self-prompting agentic workflows.
- **Image Generation**: Generate images based on user input.