Live Vulnerability Intelligence

Threat Database

Search CVEs, inspect descriptions, and open detail pages with AI-assisted technical context.

Threat Entry Updated 2026-02-02

CVE-2026-21869 - llama.cpp Plugin

llama.cpp provides inference for several LLM models in C/C++. In commit 55d4206c8 and prior, the n_discard parameter is parsed directly from JSON input in the llama.cpp server's completion endpoints without validation that it is non-negative. When a negative value is supplied and the context fills up, llama_memory_seq_rm/add receives a reversed range and a negative offset, causing out-of-bounds memory writes in the token evaluation loop. This deterministic memory corruption can crash the process or enable remote code execution (RCE). No fix was available at the time of publication.

Plugin: llama.cpp
CVE: CVE-2026-21869
Severity: HIGH (CVSS 8.8)
Published: 2026-01-08