vLLM is an inference and serving engine for large language models (LLMs). Versions 0.8.0 up to but excluding 0.9.0 have a Denial of Service (ReDoS) vulnerability that causes the vLLM server to crash if an invalid regex is provided while using structured output. This vulnerability is similar to GHSA-6qc9-v4r8-22xg / CVE-2025-48942, but involves a regex instead of a JSON schema. Version 0.9.0 fixes the issue.
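The failure mode matches the CWE-248 (Uncaught Exception) classification below: a structured-output request that supplies a regex which fails to compile raises an exception that propagates and takes down the serving process. The following sketch is only an illustration of the defensive idea, not the actual vLLM 0.9.0 patch; the helper name and error handling are assumptions. Validating the pattern up front turns the failure into a client error rather than a server crash.

```python
import re


def validate_guided_regex(pattern: str) -> None:
    """Hypothetical pre-check: reject an invalid regex before it reaches
    the structured-output backend, instead of letting the compile error
    propagate and crash the server process."""
    try:
        re.compile(pattern)
    except re.error as exc:
        # Surface as a client-side error (e.g. HTTP 400), not a crash.
        raise ValueError(f"invalid guided regex: {exc}") from exc


if __name__ == "__main__":
    try:
        validate_guided_regex(r"[a-z")  # unterminated character set
    except ValueError as exc:
        print(exc)
```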
Configurations
No configuration.
History
30 May 2025, 19:15
| Type | Values Removed | Values Added |
|---|---|---|
| New CVE | | |
Information
Published : 2025-05-30 19:15
Updated : 2025-06-02 17:32
NVD link : CVE-2025-48943
Mitre link : CVE-2025-48943
CVE.ORG link : CVE-2025-48943
Products Affected
No product.
CWE
CWE-248: Uncaught Exception