Squeezing More Memory from Language Models
A new approach to compressing key-value caches boosts performance by exploiting the inherent predictability of sequential data.
New research explores a streamlined authentication method for resource-limited IoT devices using unique silicon characteristics.
Researchers have dramatically reduced the memory requirements of the HAETAE signature scheme, paving the way for secure communication on resource-limited microcontrollers.

As AI agents take the reins of financial transactions, understanding and mitigating their unique security vulnerabilities is paramount.
New research explores how strategically assigning values to a small set of variables can dramatically simplify the problem of solving Quantified Boolean Formulas.
New research delves into how surface imperfections and temperature affect the subtle quantum forces between closely spaced objects.
A new approach leverages graph-based analysis to improve the accuracy of 4D radar scan registration, even in environments lacking distinct features.
![The system defines a margin of tolerance ρ allowing for misclassifications near the decision boundary, and constructs a hypothesis set [latex]\mathcal{H}[/latex] - shaded to indicate viable solutions - alongside a subset [latex]\mathcal{H}_{+}[/latex] correctly classifying positive examples, where positive and negative instances are represented as vectors imposing constraints on the resulting classification half-spaces.](https://arxiv.org/html/2604.14614v1/Hcone.png)
A new algorithm significantly improves the efficiency of learning intersections of halfspaces, bringing us closer to optimal performance.
Researchers have developed a highly efficient protocol for fuzzy private set intersection, enabling secure data analysis without revealing exact matches.
![The Jacobi-kernel Support Vector Machine, applied to a double-spiral dataset with parameters [latex]\alpha = \beta = 0[/latex] and [latex]C = 1[/latex], demonstrates increasingly refined decision boundaries as the truncation level [latex]n[/latex] varies from 1 to 16, evidenced by the evolving zero level set, which accurately separates the two classes within the data.](https://arxiv.org/html/2604.15285v1/varying_n_alpha_beta0.png)
A new framework reveals the inner workings of Support Vector Machines, offering a structured understanding of how polynomial kernels contribute to classification decisions.