outlines_llama-cpp-python_knowledge-graph-extraction.ipynb
1 { 2 "cells": [ 3 { 4 "cell_type": "markdown", 5 "id": "10d66530-b4e2-4b47-9413-92240069c5e1", 6 "metadata": {}, 7 "source": [ 8 "# Knowledge Graph Extraction" 9 ] 10 }, 11 { 12 "cell_type": "markdown", 13 "id": "2652e68d-4278-4e8f-9132-6ba90905d073", 14 "metadata": {}, 15 "source": [ 16 "In this guide, we use [outlines](https://outlines-dev.github.io/outlines/) to extract a knowledge graph from unstructured text with a quantized `Hermes-2-Pro-Llama-3-8B` model." 17 ] 18 }, 19 { 20 "cell_type": "markdown", 21 "id": "14f30dda-18f6-415e-ac23-e87aeb636f83", 22 "metadata": {}, 23 "source": [ 24 "## Requirements\n", 25 "\n", 26 "### Install llama-cpp-python and outlines" 27 ] 28 }, 29 { 30 "cell_type": "code", 31 "execution_count": 1, 32 "id": "467c6655-4dd6-4f4e-ad9b-42fc1de0f52c", 33 "metadata": { 34 "execution": { 35 "iopub.execute_input": "2024-08-08T18:04:58.229304Z", 36 "iopub.status.busy": "2024-08-08T18:04:58.228803Z", 37 "iopub.status.idle": "2024-08-08T18:04:58.234608Z", 38 "shell.execute_reply": "2024-08-08T18:04:58.233551Z", 39 "shell.execute_reply.started": "2024-08-08T18:04:58.229253Z" 40 } 41 }, 42 "outputs": [], 43 "source": [ 44 "# RUN IT ONLY ONCE TO INSTALL THE REQUIREMENTS\n", 45 "# %pip install llama-cpp-python outlines" 46 ] 47 }, 48 { 49 "cell_type": "markdown", 50 "id": "3da62069-96d7-43e5-8968-64b48bc1384b", 51 "metadata": {}, 52 "source": [ 53 "For detailed installation instructions, see [llama-cpp-python installation](https://llama-cpp-python.readthedocs.io/en/stable/) and [outlines installation](https://outlines-dev.github.io/outlines/installation/).\n", 54 "\n", 55 "### Pull the model from HuggingFace\n", 56 "\n", 57 "Download a GGUF model from HuggingFace [here](https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF/tree/main), for example, the Q4_K_M one (it requires 4.92 GB):" 58 ] 59 }, 60 { 61 "cell_type": "code", 62 "execution_count": 2, 63 "id": "ba6e9b01-0ad9-4f40-ac99-2e9340c1d3b1", 64 "metadata": { 65 "execution": { 66
"iopub.execute_input": "2024-08-08T18:04:58.236585Z", 67 "iopub.status.busy": "2024-08-08T18:04:58.235932Z", 68 "iopub.status.idle": "2024-08-08T18:04:58.260585Z", 69 "shell.execute_reply": "2024-08-08T18:04:58.259247Z", 70 "shell.execute_reply.started": "2024-08-08T18:04:58.236542Z" 71 } 72 }, 73 "outputs": [], 74 "source": [ 75 "# RUN IT ONLY ONCE TO DOWNLOAD THE GGUF MODEL, IN THIS CASE THE Q4_K_M\n", 76 "# !wget https://hf.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF/resolve/main/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf" 77 ] 78 }, 79 { 80 "cell_type": "markdown", 81 "id": "467fa3ae-28bf-4636-9f23-92b3204df17d", 82 "metadata": {}, 83 "source": [ 84 "## Usage\n", 85 "\n", 88 "### Define Pydantic class\n", 89 "\n", 90 "We first need to define our Pydantic class for each node and each edge of the knowledge graph:" 91 ] 92 }, 93 { 94 "cell_type": "code", 95 "execution_count": 3, 96 "id": "b1ec8443-533b-41a5-9cfd-7e3ef23c57fa", 97 "metadata": { 98 "execution": { 99 "iopub.execute_input": "2024-08-08T18:04:58.262654Z", 100 "iopub.status.busy": "2024-08-08T18:04:58.262170Z", 101 "iopub.status.idle": "2024-08-08T18:04:58.389248Z", 102 "shell.execute_reply": "2024-08-08T18:04:58.388288Z", 103 "shell.execute_reply.started": "2024-08-08T18:04:58.262605Z" 104 } 105 }, 106 "outputs": [], 107 "source": [ 108 "from pydantic import BaseModel, Field\n", 109 "\n", 110 "class Node(BaseModel):\n", 111 " \"\"\"Node of the Knowledge Graph\"\"\"\n", 112 "\n", 113 " id: int = Field(..., description=\"Unique identifier of the node\")\n", 114 " label: str = Field(..., description=\"Label of the node\")\n", 115 " property: str = Field(..., description=\"Property of the node\")\n", 116 "\n", 117 "\n", 118 "class Edge(BaseModel):\n", 119 " \"\"\"Edge of the Knowledge Graph\"\"\"\n", 120 "\n", 121 " source: int = Field(..., description=\"Unique source of the edge\")\n", 122 " target: int = Field(..., description=\"Unique target of the edge\")\n",
123 " label: str = Field(..., description=\"Label of the edge\")\n", 124 " property: str = Field(..., description=\"Property of the edge\")" 125 ] 126 }, 127 { 128 "cell_type": "markdown", 129 "id": "df2766c4-9c46-4fc8-a485-3b322da761a6", 130 "metadata": {}, 131 "source": [ 132 "We then define the Pydantic class for the knowledge graph" 133 ] 134 }, 135 { 136 "cell_type": "code", 137 "execution_count": 4, 138 "id": "8a2712e4-57d9-4466-abda-87cf7bd29f0f", 139 "metadata": { 140 "execution": { 141 "iopub.execute_input": "2024-08-08T18:04:58.390871Z", 142 "iopub.status.busy": "2024-08-08T18:04:58.390385Z", 143 "iopub.status.idle": "2024-08-08T18:04:58.400975Z", 144 "shell.execute_reply": "2024-08-08T18:04:58.399576Z", 145 "shell.execute_reply.started": "2024-08-08T18:04:58.390835Z" 146 } 147 }, 148 "outputs": [], 149 "source": [ 150 "from typing import List\n", 151 "\n", 152 "class KnowledgeGraph(BaseModel):\n", 153 " \"\"\"Generated Knowledge Graph\"\"\"\n", 154 "\n", 155 " nodes: List[Node] = Field(..., description=\"List of nodes of the knowledge graph\")\n", 156 " edges: List[Edge] = Field(..., description=\"List of edges of the knowledge graph\")" 157 ] 158 }, 159 { 160 "cell_type": "markdown", 161 "id": "83358efd-2411-4383-962e-109b9d8afcc8", 162 "metadata": {}, 163 "source": [ 164 "### Load the model" 165 ] 166 }, 167 { 168 "cell_type": "code", 169 "execution_count": 5, 170 "id": "64f2fe73-56f7-4533-9357-394d5c6555dd", 171 "metadata": { 172 "execution": { 173 "iopub.execute_input": "2024-08-08T18:04:58.403495Z", 174 "iopub.status.busy": "2024-08-08T18:04:58.402478Z", 175 "iopub.status.idle": "2024-08-08T18:05:04.124226Z", 176 "shell.execute_reply": "2024-08-08T18:05:04.123430Z", 177 "shell.execute_reply.started": "2024-08-08T18:04:58.403442Z" 178 } 179 }, 180 "outputs": [ 181 { 182 "name": "stderr", 183 "output_type": "stream", 184 "text": [ 185 "Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or 
trained.\n" 186 ] 187 } 188 ], 189 "source": [ 190 "import llama_cpp\n", 191 "from llama_cpp import Llama\n", 192 "from outlines import generate, models\n", 193 "\n", 194 "llm = Llama(\n", 195 " \"/big_storage/llms/models/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf\",\n", 196 " tokenizer=llama_cpp.llama_tokenizer.LlamaHFTokenizer.from_pretrained(\n", 197 " \"NousResearch/Hermes-2-Pro-Llama-3-8B\"\n", 198 " ),\n", 199 " n_gpu_layers=-1,\n", 200 " flash_attn=True,\n", 201 " n_ctx=8192,\n", 202 " verbose=False\n", 203 ")\n", 204 "\n", 205 "model = models.LlamaCpp(llm)" 206 ] 207 }, 208 { 209 "cell_type": "code", 210 "execution_count": 6, 211 "id": "b33f0a08-a699-4682-a50a-e5b21acb7645", 212 "metadata": { 213 "execution": { 214 "iopub.execute_input": "2024-08-08T18:05:04.126037Z", 215 "iopub.status.busy": "2024-08-08T18:05:04.125778Z", 216 "iopub.status.idle": "2024-08-08T18:05:04.129996Z", 217 "shell.execute_reply": "2024-08-08T18:05:04.128974Z", 218 "shell.execute_reply.started": "2024-08-08T18:05:04.126017Z" 219 } 220 }, 221 "outputs": [], 222 "source": [ 223 "import warnings\n", 224 "warnings.filterwarnings(\"ignore\", category=RuntimeWarning) # ignore runtime warnings" 225 ] 226 }, 227 { 228 "cell_type": "markdown", 229 "id": "91149a2f-0d15-4d8f-8827-73a15a38464f", 230 "metadata": {}, 231 "source": [ 232 "We build a regex from the `KnowledgeGraph` Pydantic class which the model will be forced to follow" 233 ] 234 }, 235 { 236 "cell_type": "code", 237 "execution_count": 7, 238 "id": "a125eb20-efb2-4274-a42d-a35f58d9db54", 239 "metadata": { 240 "execution": { 241 "iopub.execute_input": "2024-08-08T18:05:04.132103Z", 242 "iopub.status.busy": "2024-08-08T18:05:04.131434Z", 243 "iopub.status.idle": "2024-08-08T18:05:04.169395Z", 244 "shell.execute_reply": "2024-08-08T18:05:04.168357Z", 245 "shell.execute_reply.started": "2024-08-08T18:05:04.132046Z" 246 } 247 }, 248 "outputs": [ 249 { 250 "data": { 251 "text/plain": [ 252 "'\\\\{[ ]?\"nodes\"[ ]?:[ ]?\\\\[[ ]?((\\\\{[ 
]?\"id\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"label\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?,[ ]?\"property\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?\\\\})(,[ ]?(\\\\{[ ]?\"id\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"label\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?,[ ]?\"property\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?\\\\})){0,})?[ ]?\\\\][ ]?,[ ]?\"edges\"[ ]?:[ ]?\\\\[[ ]?((\\\\{[ ]?\"source\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"target\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"label\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?,[ ]?\"property\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?\\\\})(,[ ]?(\\\\{[ ]?\"source\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"target\"[ ]?:[ ]?(-)?(0|[1-9][0-9]*)[ ]?,[ ]?\"label\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?,[ ]?\"property\"[ ]?:[ ]?\"([^\"\\\\\\\\\\\\x00-\\\\x1F\\\\x7F-\\\\x9F]|\\\\\\\\[\"\\\\\\\\])*\"[ ]?\\\\})){0,})?[ ]?\\\\][ ]?\\\\}'" 253 ] 254 }, 255 "execution_count": 7, 256 "metadata": {}, 257 "output_type": "execute_result" 258 } 259 ], 260 "source": [ 261 "from outlines.integrations.utils import convert_json_schema_to_str\n", 262 "from outlines.fsm.json_schema import build_regex_from_schema\n", 263 "\n", 264 "json_schema = KnowledgeGraph.model_json_schema()\n", 265 "schema_str = convert_json_schema_to_str(json_schema=json_schema)\n", 266 "regex_str = build_regex_from_schema(schema_str)\n", 267 "regex_str" 268 ] 269 }, 270 { 271 "cell_type": "markdown", 272 "id": "a853e5b6-1a40-43aa-a03d-4d5ece02b9c7", 273 "metadata": {}, 274 "source": [ 275 "We then need to adapt our prompt to the [Hermes prompt format for JSON 
schema](https://github.com/NousResearch/Hermes-Function-Calling?tab=readme-ov-file#prompt-format-for-json-mode--structured-outputs)" 276 ] 277 }, 303 { 304 "cell_type": "code", 305 "execution_count": 9, 306 "id": "a5de546b-8dab-4125-a478-ceee0eaa3225", 307 "metadata": { 308 "execution": { 309 "iopub.execute_input": "2024-08-08T18:05:04.179280Z", 310 "iopub.status.busy": "2024-08-08T18:05:04.178432Z", 311 "iopub.status.idle": "2024-08-08T18:05:04.200243Z", 312 "shell.execute_reply": "2024-08-08T18:05:04.198925Z", 313 "shell.execute_reply.started": "2024-08-08T18:05:04.179233Z" 314 } 315 }, 316 "outputs": [], 317 "source": [ 318 "def generate_hermes_prompt(user_prompt):\n", 319 " return (\n", 320 " \"<|im_start|>system\\n\"\n", 321 " \"You are a world class AI model who answers questions in JSON \"\n", 322 " f\"Here's the json schema you must adhere to:\\n<schema>\\n{json_schema}\\n</schema><|im_end|>\\n\"\n", 323 " \"<|im_start|>user\\n\"\n", 324 " + user_prompt\n", 325 " + \"<|im_end|>\"\n", 326 " + \"\\n<|im_start|>assistant\\n\"\n", 327
\"<schema>\"\n", 328 " )" 329 ] 330 }, 331 { 332 "cell_type": "markdown", 333 "id": "133966dc-642a-4c64-983a-66f0ece78b2b", 334 "metadata": {}, 335 "source": [ 336 "For a given `user_prompt` we obtain the hermes prompt" 337 ] 338 }, 339 { 340 "cell_type": "code", 341 "execution_count": 10, 342 "id": "e64535be-9597-49c9-a8e9-6d0598c74ec2", 343 "metadata": { 344 "execution": { 345 "iopub.execute_input": "2024-08-08T18:05:04.202188Z", 346 "iopub.status.busy": "2024-08-08T18:05:04.201721Z", 347 "iopub.status.idle": "2024-08-08T18:05:04.218040Z", 348 "shell.execute_reply": "2024-08-08T18:05:04.216790Z", 349 "shell.execute_reply.started": "2024-08-08T18:05:04.202142Z" 350 } 351 }, 352 "outputs": [ 353 { 354 "name": "stdout", 355 "output_type": "stream", 356 "text": [ 357 "<|im_start|>system\n", 358 "You are a world class AI model who answers questions in JSON Here's the json schema you must adhere to:\n", 359 "<schema>\n", 360 "{'$defs': {'Edge': {'description': 'Edge of the Knowledge Graph', 'properties': {'source': {'description': 'Unique source of the edge', 'title': 'Source', 'type': 'integer'}, 'target': {'description': 'Unique target of the edge', 'title': 'Target', 'type': 'integer'}, 'label': {'description': 'Label of the edge', 'title': 'Label', 'type': 'string'}, 'property': {'description': 'Property of the edge', 'title': 'Property', 'type': 'string'}}, 'required': ['source', 'target', 'label', 'property'], 'title': 'Edge', 'type': 'object'}, 'Node': {'description': 'Node of the Knowledge Graph', 'properties': {'id': {'description': 'Unique identifier of the node', 'title': 'Id', 'type': 'integer'}, 'label': {'description': 'Label of the node', 'title': 'Label', 'type': 'string'}, 'property': {'description': 'Property of the node', 'title': 'Property', 'type': 'string'}}, 'required': ['id', 'label', 'property'], 'title': 'Node', 'type': 'object'}}, 'description': 'Generated Knowledge Graph', 'properties': {'nodes': {'description': 'List of nodes of the 
knowledge graph', 'items': {'$ref': '#/$defs/Node'}, 'title': 'Nodes', 'type': 'array'}, 'edges': {'description': 'List of edges of the knowledge graph', 'items': {'$ref': '#/$defs/Edge'}, 'title': 'Edges', 'type': 'array'}}, 'required': ['nodes', 'edges'], 'title': 'KnowledgeGraph', 'type': 'object'}\n", 361 "</schema><|im_end|>\n", 362 "<|im_start|>user\n", 363 "Alice loves Bob and she hates Charlie.<|im_end|>\n", 364 "<|im_start|>assistant\n", 365 "<schema>\n" 366 ] 367 } 368 ], 369 "source": [ 370 "user_prompt = \"Alice loves Bob and she hates Charlie.\"\n", 371 "prompt = generate_hermes_prompt(user_prompt)\n", 372 "print(prompt)" 373 ] 374 }, 375 { 376 "cell_type": "markdown", 377 "id": "a0608a24-7f39-4ed8-9b6b-d551d9d556a6", 378 "metadata": {}, 379 "source": [ 380 "We use `generate.regex` by passing the `regex_str` from the Pydantic class we previously defined, and call the generator with the Hermes prompt:" 381 ] 382 }, 383 { 384 "cell_type": "code", 385 "execution_count": 11, 386 "id": "97e39197-cf88-4008-8e6d-814dec90a9a8", 387 "metadata": { 388 "execution": { 389 "iopub.execute_input": "2024-08-08T18:05:04.220341Z", 390 "iopub.status.busy": "2024-08-08T18:05:04.219491Z", 391 "iopub.status.idle": "2024-08-08T18:05:07.144417Z", 392 "shell.execute_reply": "2024-08-08T18:05:07.143798Z", 393 "shell.execute_reply.started": "2024-08-08T18:05:04.220292Z" 394 } 395 }, 396 "outputs": [], 397 "source": [ 398 "generator = generate.regex(model, regex_str)\n", 399 "response = generator(prompt, max_tokens=1024, temperature=0, seed=42)" 400 ] 401 }, 402 { 403 "cell_type": "markdown", 404 "id": "19f0959a-b239-4580-9dec-fad1e3f40211", 405 "metadata": {}, 406 "source": [ 407 "We obtain the nodes and edges of the knowledge graph" 408 ] 409 }, 410 { 411 "cell_type": "code", 412 "execution_count": 12, 413 "id": "29b7a96d-49fe-4f0c-ba01-a89a2d884194", 414 "metadata": { 415 "execution": { 416 "iopub.execute_input": "2024-08-08T18:05:07.145416Z", 417 "iopub.status.busy": 
"2024-08-08T18:05:07.145203Z", 418 "iopub.status.idle": "2024-08-08T18:05:07.149480Z", 419 "shell.execute_reply": "2024-08-08T18:05:07.149039Z", 420 "shell.execute_reply.started": "2024-08-08T18:05:07.145395Z" 421 } 422 }, 423 "outputs": [ 424 { 425 "data": { 426 "text/plain": [ 427 "[{'id': 1, 'label': 'Alice', 'property': 'person'},\n", 428 " {'id': 2, 'label': 'Bob', 'property': 'person'},\n", 429 " {'id': 3, 'label': 'Charlie', 'property': 'person'}]" 430 ] 431 }, 432 "execution_count": 12, 433 "metadata": {}, 434 "output_type": "execute_result" 435 } 436 ], 437 "source": [ 438 "import json\n", 439 "\n", 440 "json_response = json.loads(response)\n", 441 "json_response[\"nodes\"]" 442 ] 443 }, 444 { 445 "cell_type": "code", 446 "execution_count": 13, 447 "id": "5cca56af-58d5-4efe-9988-f1e141d8e556", 448 "metadata": { 449 "execution": { 450 "iopub.execute_input": "2024-08-08T18:05:07.150178Z", 451 "iopub.status.busy": "2024-08-08T18:05:07.150023Z", 452 "iopub.status.idle": "2024-08-08T18:05:07.175270Z", 453 "shell.execute_reply": "2024-08-08T18:05:07.174648Z", 454 "shell.execute_reply.started": "2024-08-08T18:05:07.150163Z" 455 } 456 }, 457 "outputs": [ 458 { 459 "data": { 460 "text/plain": [ 461 "[{'source': 1, 'target': 2, 'label': 'love', 'property': 'relationship'},\n", 462 " {'source': 1, 'target': 3, 'label': 'hate', 'property': 'relationship'}]" 463 ] 464 }, 465 "execution_count": 13, 466 "metadata": {}, 467 "output_type": "execute_result" 468 } 469 ], 470 "source": [ 471 "json_response[\"edges\"]" 472 ] 473 }, 474 { 475 "cell_type": "markdown", 476 "id": "63d08fd4-b0f3-45cc-b5c2-19c71baf2b0f", 477 "metadata": {}, 478 "source": [ 479 "## (Optional) Visualizing the Knowledge Graph" 480 ] 481 }, 482 { 483 "cell_type": "markdown", 484 "id": "a4ce945f-2035-401e-8c8b-98b6d0de2a7e", 485 "metadata": {}, 486 "source": [ 487 "We can use the [Graphviz library](https://graphviz.readthedocs.io/en/stable/) to visualize the generated knowledge graph. 
For detailed installation instructions, see [here](https://graphviz.readthedocs.io/en/stable/#installation)." 488 ] 489 }, 490 { 491 "cell_type": "code", 492 "execution_count": 14, 493 "id": "580e7367-a5ea-4649-b74d-2ab36ce17459", 494 "metadata": { 495 "execution": { 496 "iopub.execute_input": "2024-08-08T18:05:07.176411Z", 497 "iopub.status.busy": "2024-08-08T18:05:07.176096Z", 498 "iopub.status.idle": "2024-08-08T18:05:07.270015Z", 499 "shell.execute_reply": "2024-08-08T18:05:07.268900Z", 500 "shell.execute_reply.started": "2024-08-08T18:05:07.176386Z" 501 } 502 }, 503 "outputs": [ 504 { 505 "data": { 506 "image/svg+xml": [ 507 "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n", 508 "<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.1//EN\"\n", 509 " \"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd\">\n", 510 "<!-- Generated by graphviz version 2.43.0 (0)\n", 511 " -->\n", 512 "<!-- Title: %3 Pages: 1 -->\n", 513 "<svg width=\"170pt\" height=\"203pt\"\n", 514 " viewBox=\"0.00 0.00 170.00 203.00\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\n", 515 "<g id=\"graph0\" class=\"graph\" transform=\"scale(1 1) rotate(0) translate(4 199)\">\n", 516 "<title>%3</title>\n", 517 "<polygon fill=\"white\" stroke=\"transparent\" points=\"-4,4 -4,-199 166,-199 166,4 -4,4\"/>\n", 518 "<!-- 1 -->\n", 519 "<g id=\"node1\" class=\"node\">\n", 520 "<title>1</title>\n", 521 "<ellipse fill=\"none\" stroke=\"black\" cx=\"81\" cy=\"-159\" rx=\"36\" ry=\"36\"/>\n", 522 "<text text-anchor=\"middle\" x=\"81\" y=\"-155.3\" font-family=\"Times,serif\" font-size=\"14.00\">Alice</text>\n", 523 "</g>\n", 524 "<!-- 2 -->\n", 525 "<g id=\"node2\" class=\"node\">\n", 526 "<title>2</title>\n", 527 "<ellipse fill=\"none\" stroke=\"black\" cx=\"36\" cy=\"-36\" rx=\"36\" ry=\"36\"/>\n", 528 "<text text-anchor=\"middle\" x=\"36\" y=\"-32.3\" font-family=\"Times,serif\" font-size=\"14.00\">Bob</text>\n", 529 "</g>\n", 530 "<!-- 1->2 -->\n", 531 "<g 
id=\"edge1\" class=\"edge\">\n", 532 "<title>1->2</title>\n", 533 "<path fill=\"none\" stroke=\"black\" d=\"M68.7,-124.94C63.49,-110.93 57.36,-94.44 51.82,-79.54\"/>\n", 534 "<polygon fill=\"black\" stroke=\"black\" points=\"55.05,-78.18 48.28,-70.02 48.49,-80.62 55.05,-78.18\"/>\n", 535 "<text text-anchor=\"middle\" x=\"77\" y=\"-93.8\" font-family=\"Times,serif\" font-size=\"14.00\">love</text>\n", 536 "</g>\n", 537 "<!-- 3 -->\n", 538 "<g id=\"node3\" class=\"node\">\n", 539 "<title>3</title>\n", 540 "<ellipse fill=\"none\" stroke=\"black\" cx=\"126\" cy=\"-36\" rx=\"36\" ry=\"36\"/>\n", 541 "<text text-anchor=\"middle\" x=\"126\" y=\"-32.3\" font-family=\"Times,serif\" font-size=\"14.00\">Charlie</text>\n", 542 "</g>\n", 543 "<!-- 1->3 -->\n", 544 "<g id=\"edge2\" class=\"edge\">\n", 545 "<title>1->3</title>\n", 546 "<path fill=\"none\" stroke=\"black\" d=\"M93.3,-124.94C98.51,-110.93 104.64,-94.44 110.18,-79.54\"/>\n", 547 "<polygon fill=\"black\" stroke=\"black\" points=\"113.51,-80.62 113.72,-70.02 106.95,-78.18 113.51,-80.62\"/>\n", 548 "<text text-anchor=\"middle\" x=\"122\" y=\"-93.8\" font-family=\"Times,serif\" font-size=\"14.00\">hate</text>\n", 549 "</g>\n", 550 "</g>\n", 551 "</svg>\n" 552 ], 553 "text/plain": [ 554 "<graphviz.graphs.Digraph at 0x7858dcf2ed10>" 555 ] 556 }, 557 "execution_count": 14, 558 "metadata": {}, 559 "output_type": "execute_result" 560 } 561 ], 562 "source": [ 563 "from graphviz import Digraph\n", 564 "\n", 565 "dot = Digraph()\n", 566 "for node in json_response[\"nodes\"]:\n", 567 " dot.node(str(node[\"id\"]), node[\"label\"], shape='circle', width='1', height='1')\n", 568 "for edge in json_response[\"edges\"]:\n", 569 " dot.edge(str(edge[\"source\"]), str(edge[\"target\"]), label=edge[\"label\"])\n", 570 "\n", 571 "dot" 572 ] 573 } 574 ], 575 "metadata": { 576 "kernelspec": { 577 "display_name": "Python 3 (ipykernel)", 578 "language": "python", 579 "name": "python3" 580 }, 581 "language_info": { 582 "codemirror_mode": { 583 
"name": "ipython", 584 "version": 3 585 }, 586 "file_extension": ".py", 587 "mimetype": "text/x-python", 588 "name": "python", 589 "nbconvert_exporter": "python", 590 "pygments_lexer": "ipython3", 591 "version": "3.10.12" 592 } 593 }, 594 "nbformat": 4, 595 "nbformat_minor": 5 596 }