CVE-2026-22778.yaml
info:
  name: vllm
  cve: CVE-2026-22778
  summary: A chain of vulnerabilities in vLLM, including an information leak and a heap overflow, allows Remote Code Execution (RCE) when processing malicious video URLs.
  details: |
    The vulnerability chain in vLLM enables Remote Code Execution (RCE) through two primary flaws:
    1. **Information Leak (ASLR Bypass):** Sending an invalid image to vLLM's multimodal endpoint causes PIL to raise an error that is returned to the client and leaks a heap address, significantly reducing the effectiveness of ASLR.
    2. **Heap Overflow:** vLLM uses OpenCV (cv2) to decode videos, and OpenCV bundles FFmpeg 5.1.x. The JPEG2000 decoder in FFmpeg contains a heap overflow: a malicious `cdef` box in a JPEG2000 frame can remap the Y (luma) channel into the smaller U (chroma) buffer. The overflow can be used to overwrite an `AVBuffer` structure's `free()` function pointer with `system()`, giving arbitrary command execution when the buffer is freed.
    The exploit is reachable through the `/v1/chat/completions` and `/v1/invocations` endpoints when a `video_url` is provided. Deployments that do not serve a video model are not affected.
  cvss: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
  severity: CRITICAL
  security_advise: |
    1. Upgrade vLLM to version `0.14.1` or later.
    2. If upgrading is not immediately possible, ensure that only trusted video sources are processed.
    3. Review and implement authentication in front of vLLM instances, noting that this exploit requires no authentication in some configurations.
rule: version >= "0.8.3" && version < "0.14.1"
references:
  - https://github.com/vllm-project/vllm/security/advisories/GHSA-4r2x-xpjr-7cvv
  - https://nvd.nist.gov/vuln/detail/CVE-2026-22778
  - https://github.com/vllm-project/vllm/pull/31987
  - https://github.com/vllm-project/vllm/pull/32319
  - https://github.com/vllm-project/vllm/pull/32668
  - https://github.com/vllm-project/vllm
  - https://github.com/vllm-project/vllm/releases/tag/v0.14.1