
DeepSeek’s New Training Method: What It Means for AI Development Costs
Chinese AI startup DeepSeek kicked off 2026 with a research paper introducing a new approach to training large language models. The method, called Manifold-Constrained Hyper-Connections (mHC), aims to make training more stable and efficient, potentially reducing the computational resources needed to develop powerful AI systems.