vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs. Versions from 0.6.5 up to but not including 0.8.5 that use the vLLM integration with Mooncake are vulnerable to remote code execution, because the integration uses pickle-based serialization over unsecured ZeroMQ sockets. The vulnerable sockets were set to listen on all network interfaces, increasing the likelihood that an attacker can reach them and carry out an attack. vLLM instances that do not use the Mooncake integration are not affected. This issue has been patched in version 0.8.5.
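The core issue is that pickle deserialization executes attacker-controlled code, so feeding data from an exposed network socket into `pickle.loads` amounts to remote code execution. The sketch below is a hypothetical, self-contained illustration of that pattern, not the actual vLLM or Mooncake code; it assumes pyzmq is installed, and the port number, socket types, and `Payload` class are made up for the example.

```python
import pickle
import zmq

# Minimal sketch of the vulnerable pattern (hypothetical, not vLLM/Mooncake code):
# a ZeroMQ socket bound on all interfaces whose messages go straight to pickle.loads.

ctx = zmq.Context.instance()

# "Server" side: binding on every interface means any host that can reach the
# port can deliver a message.
receiver = ctx.socket(zmq.PULL)
receiver.bind("tcp://*:5555")

# "Attacker" side: pickle calls the callable returned by __reduce__ during
# deserialization, so the sender controls what the receiver executes.
class Payload:
    def __reduce__(self):
        # Harmless stand-in; a real exploit would return e.g. os.system.
        return (print, ("arbitrary code ran during unpickling",))

sender = ctx.socket(zmq.PUSH)
sender.connect("tcp://127.0.0.1:5555")
sender.send(pickle.dumps(Payload()))

msg = receiver.recv()      # raw bytes from an untrusted peer
obj = pickle.loads(msg)    # CWE-502: the attacker-chosen callable runs here
```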
Configurations
No configuration.
History
30 Apr 2025, 01:15
Type | Values Removed | Values Added
---|---|---
New CVE | |
Information
Published : 2025-04-30 01:15
Updated : 2025-04-30 01:15
NVD link : CVE-2025-32444
Mitre link : CVE-2025-32444
CVE.ORG link : CVE-2025-32444
Products Affected
No product.
CWE
CWE-502
Deserialization of Untrusted Data
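A common way to avoid CWE-502 in this kind of transport is to exchange plain data in a format that cannot trigger code execution on load, and to avoid binding on all network interfaces. The sketch below illustrates that idea under those assumptions; it is not the actual change shipped in vLLM 0.8.5, and the message fields and port are invented for the example.

```python
import json
import zmq

# Illustrative safer variant (not the actual 0.8.5 fix): a data-only format
# such as JSON cannot execute code on load, and binding to loopback only
# keeps the socket off external interfaces.

ctx = zmq.Context.instance()

receiver = ctx.socket(zmq.PULL)
receiver.bind("tcp://127.0.0.1:5556")   # loopback only, not all interfaces

sender = ctx.socket(zmq.PUSH)
sender.connect("tcp://127.0.0.1:5556")
sender.send(json.dumps({"op": "kv_transfer", "key": "block_0"}).encode())

msg = receiver.recv()
obj = json.loads(msg)                   # plain data; no code is executed
```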