Mistral Codestral-Mamba: State-Space Model Achieves SOTA Efficiency on Long-Code Completion


[AI-NEWS-ENTRY]

Date: 2026-02-19

Title: Mistral Codestral-Mamba: State-Space Model Achieves SOTA Efficiency on Long-Code Completion

Content: Codestral-Mamba (based on the Mamba-2 architecture) sets a new efficiency record for long-context code completion: 4.1× faster inference than equivalent Transformer models at 256k context, while matching or exceeding pass@1 scores on HumanEval+, MultiPL-E, and BigCodeBench-hard.
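
The reported speedup at long context follows from the state-space formulation: decoding keeps a fixed-size hidden state instead of a key-value cache that grows with every token. The snippet below is a minimal toy sketch of that recurrence (plain NumPy, made-up dimensions, not Mistral's implementation) to show why the per-token cost stays constant as context length grows.

```python
import numpy as np

def ssm_step(state, x_t, A, B, C):
    """One recurrent update of a toy diagonal linear state-space layer.

    The hidden state is a fixed-size vector, so each new token costs
    O(d_state) work and memory regardless of how many tokens came before --
    the property Mamba-style models exploit for long-context decoding.
    """
    state = A * state + B * x_t      # fold the new input into the state
    y_t = np.dot(C, state)           # read out the output for this token
    return state, y_t

# Toy dimensions; real Mamba layers use learned, input-dependent (selective)
# parameters rather than the fixed values used here for illustration.
d_state = 16
A = np.full(d_state, 0.9)            # per-channel decay
B = np.full(d_state, 0.1)
C = np.random.randn(d_state)

state = np.zeros(d_state)
for x_t in np.random.randn(256):     # stream tokens one at a time
    state, y_t = ssm_step(state, x_t, A, B, C)
# Memory stays at d_state floats throughout; a Transformer's KV cache would
# instead hold entries for all 256 (or 256k) previous tokens.
```
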

Keywords: state-space model, Mamba, long-context code, Codestral-Mamba, code completion, inference efficiency