vLLM is an inference and serving engine for large language models (LLMs). The SSRF protection fix for CVE-2026-24779, added in 0.15.1, can be bypassed in the load_from_url_async method due to inconsistent URL parsing between the validation layer and the actual HTTP client. The SSRF fix uses urllib3.util.parse_url() to validate and extract the hostname from user-provided URLs, but load_from_url_async uses aiohttp to make the actual HTTP requests, and aiohttp internally parses URLs with the yarl library. A URL that the two parsers interpret differently can therefore pass hostname validation against one host while the request is actually sent to another. This vulnerability is fixed in 0.17.0.
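One general mitigation for this class of parser-differential bug is to validate the URL and then re-serialize it from the validated components, so the HTTP client fetches exactly the host that was checked rather than re-parsing the raw string with a different library. The sketch below illustrates that pattern using only the standard library's urllib.parse as a stand-in for both parsers; the allowlist, hostname, and function name are hypothetical, and this is not the actual vLLM patch.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical allowlist of hosts the server may fetch from.
ALLOWED_HOSTS = {"models.example.com"}

def validate_and_normalize(url: str) -> str:
    """Validate a user-supplied URL and return a re-serialized form.

    Rebuilding the URL from the parsed components (and dropping
    userinfo) means the client cannot be steered to a different host
    than the one the validator saw, even if another parser would have
    read the raw string differently.
    """
    parts = urlsplit(url)
    if parts.scheme not in ("http", "https"):
        raise ValueError(f"unsupported scheme: {parts.scheme!r}")
    if parts.hostname not in ALLOWED_HOSTS:
        raise ValueError(f"host not allowed: {parts.hostname!r}")
    # Rebuild netloc from hostname/port only, discarding any
    # user:password@ prefix that could confuse a second parser.
    netloc = parts.hostname if parts.port is None else f"{parts.hostname}:{parts.port}"
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, ""))
```

A URL such as `http://models.example.com@evil.example.com/x` is rejected here, because urlsplit reports the hostname as `evil.example.com`; a naive validator that only substring-matched the allowlisted name against the raw string would have let it through.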
References
| Link | Resource |
|---|---|
| https://github.com/vllm-project/vllm/commit/6f3b2047abd4a748e3db4a68543f8221358002c0 | Patch |
| https://github.com/vllm-project/vllm/pull/34743 | Issue Tracking, Patch |
| https://github.com/vllm-project/vllm/security/advisories/GHSA-qh4c-xf7m-gxfc | Not Applicable |
| https://github.com/vllm-project/vllm/security/advisories/GHSA-v359-jj2v-j536 | Exploit, Patch, Vendor Advisory |
Configurations
History
18 Mar 2026, 18:36
| Type | Values Removed | Values Added |
|---|---|---|
| CPE | | cpe:2.3:a:vllm:vllm:*:*:*:*:*:*:*:* |
| First Time | | Vllm, Vllm vllm |
| References | | https://github.com/vllm-project/vllm/commit/6f3b2047abd4a748e3db4a68543f8221358002c0 - Patch |
| References | | https://github.com/vllm-project/vllm/pull/34743 - Issue Tracking, Patch |
| References | | https://github.com/vllm-project/vllm/security/advisories/GHSA-qh4c-xf7m-gxfc - Not Applicable |
| References | | https://github.com/vllm-project/vllm/security/advisories/GHSA-v359-jj2v-j536 - Exploit, Patch, Vendor Advisory |
11 Mar 2026, 13:53
| Type | Values Removed | Values Added |
|---|---|---|
| Summary | | |
09 Mar 2026, 21:16
| Type | Values Removed | Values Added |
|---|---|---|
| New CVE | | |
Information
Published : 2026-03-09 21:16
Updated : 2026-03-18 18:36
NVD link : CVE-2026-25960
Mitre link : CVE-2026-25960
CVE.ORG link : CVE-2026-25960
Products Affected
vllm
- vllm
CWE
CWE-918
Server-Side Request Forgery (SSRF)
