Baidu (BIDU) is putting its Ernie 4.5 multimodal model family, a lineup of ten MoE variants, into the open, along with industrial-grade toolkits, as it doubles down on AI framework leadership amid stiff competition in the Chinese market.
In a blog post, Baidu announced that starting June 30 it will open source Ernie 4.5—its latest large-scale multimodal lineup built on a Mixture-of-Experts architecture that boosts cross-modal understanding without sacrificing text performance.
Alongside model weights, the company is releasing resource-efficient training and inference toolkits designed for multi-hardware deployment. A Baidu spokesperson told CNBC that the rollout will be phased, ensuring developers can ramp up gradually.
Baidu was among the first Chinese firms to ship a ChatGPT-style model domestically, but faces rivals such as Alibaba's (BABA) Tongyi Qianwen, MiniMax, Baichuan and Zhipu AI. At April's developer conference, CEO Robin Li emphasized that Baidu's goal is to “empower developers to build the best applications—without having to worry about model capability, costs, or development tools.”
Open sourcing Ernie 4.5 and its toolchains signals Baidu's intent to build an ecosystem moat: the more developers adopt its frameworks, the harder it becomes for competitors to catch up. The MoE design also paves the way for cost-efficient scaling, since each input is routed only to the handful of experts most relevant to it, meaning just a fraction of the model's total parameters is active per token.
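For readers curious about the mechanism, here is a minimal sketch of top-k MoE routing. This is illustrative NumPy, not Baidu's code: the expert count, the choice of k, and the function name topk_moe_layer are assumptions for the example. The point it shows is that a gating network scores every expert, but only the k highest-scoring experts actually run.

```python
import numpy as np

def topk_moe_layer(x, gate_w, experts, k=2):
    """Toy top-k MoE routing: only k of the experts run per input.

    x        -- input vector, shape (d,)
    gate_w   -- gating weights, shape (num_experts, d)
    experts  -- list of callables, each mapping (d,) -> (d,)
    k        -- number of experts activated per input
    """
    logits = gate_w @ x                    # score every expert for this input
    top = np.argsort(logits)[-k:]          # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    # Compute is spent only on the chosen experts; the rest stay idle.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Illustrative usage: 8 tiny linear "experts", 2 active per input.
rng = np.random.default_rng(0)
d, num_experts = 16, 8
expert_ws = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(num_experts)]
experts = [lambda v, W=W: W @ v for W in expert_ws]
gate_w = rng.standard_normal((num_experts, d)) / np.sqrt(d)
out = topk_moe_layer(rng.standard_normal(d), gate_w, experts, k=2)
print(out.shape)  # (16,) -- full-width output from roughly 2/8 of the expert compute
```

Under these assumptions, total parameter count can grow with the number of experts while per-input compute stays roughly constant, which is the cost-efficiency argument for MoE scaling.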
With Ernie 4.5 Turbo and the reasoning-focused Ernie X1 Turbo already in play—and a pool of 30,000 homegrown Kunlun P800 chips fueling large-scale training—Baidu looks to cement its AI platform as the go-to choice for China's next generation of AI applications.