DeepSeek V4 ships open weights — frontier reasoning at MoE serving cost
DeepSeek has released V4 model weights under its permissive, commercial-friendly license. The model is a Mixture-of-Experts at trillion-parameter total scale, with roughly 37-50B active parameters per token. Capability lands within 20 Elo points of GPT-5 mini on reasoning benchmarks (AIME, …
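The "MoE serving cost" point in the headline comes down to simple arithmetic: per-token compute scales with *active* parameters, not total. A rough sketch, using the article's approximate figures (a ~1T total-parameter MoE with ~40B active per token as a midpoint; these are not official specs) and the common ~2 FLOPs-per-parameter forward-pass estimate:

```python
# Illustrative arithmetic only: parameter counts are the article's rough
# figures, not official DeepSeek specs.

def per_token_flops(active_params: float) -> float:
    """Approximate forward-pass FLOPs per token: ~2 FLOPs per active parameter."""
    return 2 * active_params

total_params = 1e12    # ~1T total parameters (MoE)
active_params = 40e9   # ~37-50B active per token; 40B taken as a midpoint

# A hypothetical dense model of the same total size would activate everything.
dense_cost = per_token_flops(total_params)
moe_cost = per_token_flops(active_params)

print(f"MoE per-token FLOPs:   {moe_cost:.1e}")
print(f"Dense per-token FLOPs: {dense_cost:.1e}")
print(f"Compute ratio (dense / MoE): {dense_cost / moe_cost:.0f}x")
```

Under these assumptions, serving costs scale like a ~40B dense model despite trillion-parameter total capacity, which is the economic argument for MoE at this scale.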