Baidu Open-Sources Ernie 4.5 AI Models and Toolkit for Developers

Baidu has open-sourced 10 variants of its Ernie 4.5 AI models, along with a multi-hardware toolkit, enabling developers to easily train and customize the models.

The global competition in the field of Artificial Intelligence (AI) is rapidly intensifying, and against this backdrop Baidu, a leading Chinese tech company, has taken a major step by making 10 variants of its latest AI model family, the Ernie 4.5 series, available to the open-source community.

This launch is significant not only from a technical perspective; it is also seen as an important step towards democratizing AI innovation and making it more accessible. Alongside the models, the company has released a multi-hardware toolkit, 'ErnieKit,' which gives developers greater freedom and flexibility in model training and customization.

What is the Ernie 4.5 Series?

Ernie (Enhanced Representation through Knowledge Integration) is Baidu's flagship generative AI model, based on technologies similar to GPT. Ernie 4.5 is its advanced version, specifically designed with a Mixture-of-Experts (MoE) architecture.
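
To make the MoE idea concrete, here is a minimal, self-contained Python sketch of the general routing pattern (illustrative only, not Baidu's actual Ernie 4.5 implementation): a small router scores the experts for each token, and only the top-scoring experts are actually run.

```python
import numpy as np

# Minimal sketch of Mixture-of-Experts routing (illustrative only;
# not Baidu's Ernie 4.5 implementation).
rng = np.random.default_rng(0)

d_model, n_experts, top_k = 64, 8, 2
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x):
    """Route each token to its top-k experts; only those experts run."""
    logits = x @ router                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # indices of the best experts
    out = np.zeros_like(x)
    for t, token in enumerate(x):
        weights = np.exp(logits[t, top[t]])
        weights /= weights.sum()                   # softmax over the selected experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (token @ experts[e])     # only top-k experts do any work
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)                     # (4, 64)
```

The key property is that the layer's capacity grows with the number of experts, while the per-token compute stays proportional to top_k.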

Baidu had previously announced that it would open-source Ernie 4.5 on July 31, 2025, but the company made it public ahead of schedule, on July 1, 2025, creating a wave of excitement in the global AI community.

Key Features of the Released Models

The 10 variants of Ernie 4.5 are designed with various capabilities and sizes:

  • 8 models are based on MoE (Mixture-of-Experts) technology.
  • 4 are multimodal models that can process both vision and language.
  • 2 models are designed for reasoning/thinking capabilities.
  • 5 models are post-trained, while the others are released at the pre-trained stage.

All these models have been launched under the Apache 2.0 license, which allows both academic and commercial use.
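
Because the checkpoints are released under a permissive license, they can be pulled into standard tooling. The sketch below uses the Hugging Face transformers library; the repository ID is an assumption made for illustration and should be checked against Baidu's official model listings.

```python
# Hedged sketch: loading one of the open Ernie 4.5 checkpoints with the
# Hugging Face `transformers` library. The repo id below is an assumption;
# verify the exact names on Baidu's official model pages before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed repository id, verify before use
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("Explain Mixture-of-Experts in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```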

Technical Strengths: From Parameters to Performance

Baidu has also shared in-depth technical information about these models. According to the company:

  • The MoE models come in configurations with 47 billion and 3 billion active parameters; only a fraction of the total parameters is used for any given token, which keeps a balance between performance and efficiency (the arithmetic is worked out in the sketch after this list).
  • The largest model has 424 billion total parameters, which places the Ernie 4.5 series in the category of mega-scale models.
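
The model names quoted below (Ernie-4.5-300B-A47B and Ernie-4.5-21B-A3B) encode this split between total and active parameters. The short sketch that follows works out the arithmetic behind the efficiency claim: per-token compute scales with the active parameters, not the total.

```python
# Back-of-envelope arithmetic for total vs. active parameters in an MoE model.
# The point: per-token compute scales with the *active* parameters, not the total.
models = {
    "ERNIE-4.5-300B-A47B": {"total_b": 300, "active_b": 47},
    "ERNIE-4.5-21B-A3B":   {"total_b": 21,  "active_b": 3},
}
for name, p in models.items():
    frac = p["active_b"] / p["total_b"]
    print(f"{name}: {p['active_b']}B of {p['total_b']}B parameters "
          f"active per token (~{frac:.0%} of the model)")
```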

Baidu claims that its Ernie-4.5-300B-A47B-Base model outperforms DeepSeek-V3-671B-A37B-Base, a competing large language model, in 22 out of 28 benchmark tests.

In another comparative analysis, the Ernie-4.5-21B-A3B-Base model, despite having fewer parameters, performs better than Qwen3-30B-A3B-Base, especially in areas like mathematics and logical reasoning.

Special Training Strategy for Models

Baidu has trained these models on its own PaddlePaddle framework. During training, several modern techniques were used:

  • Asymmetric MoE structure
  • Intra-node expert parallelism
  • Memory-efficient pipeline scheduling
  • FP8 mixed-precision training (the general idea is sketched after this list)
  • Fine-grained recomputation
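
Of these techniques, mixed-precision training is the easiest to illustrate in a few lines. The sketch below shows only the general pattern: a high-precision master copy of the weights, with the heavy matrix computation done in a lower precision. NumPy has no FP8 type, so float16 stands in here, and none of this reflects PaddlePaddle's actual FP8 machinery.

```python
import numpy as np

# Conceptual sketch of mixed-precision training: keep an FP32 master copy of
# the weights, run the expensive compute in a lower precision, then apply the
# update back to the master copy. float16 stands in for FP8 (NumPy has no FP8).
rng = np.random.default_rng(0)

master_w = rng.standard_normal((16, 16)).astype(np.float32)  # FP32 master weights
x = rng.standard_normal((4, 16)).astype(np.float32)

for step in range(3):
    w_low = master_w.astype(np.float16)           # cast weights down for compute
    y = x.astype(np.float16) @ w_low              # low-precision forward pass
    grad_y = np.ones_like(y)                      # dummy upstream gradient
    grad_w = x.T.astype(np.float16) @ grad_y      # low-precision backward pass
    master_w -= 0.01 * grad_w.astype(np.float32)  # update the FP32 master copy

print(master_w.dtype, w_low.dtype)                # float32 float16
```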

ErnieKit: An All-in-One Toolkit for Developers

Beyond the models themselves, Baidu has also open-sourced a developer toolkit called ErnieKit. With this toolkit, users can perform the following tasks:

  • Pre-training and SFT (Supervised Fine-Tuning)
  • LoRA (Low-Rank Adaptation), sketched conceptually after this list
  • Custom training and optional scaling methods
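
Of these, LoRA can be summarized in a few lines of code. The sketch below shows the core idea (a frozen base weight plus a trainable low-rank update) and is purely conceptual; it does not use ErnieKit's API.

```python
import numpy as np

# Minimal sketch of the LoRA idea: freeze the base weight W and learn a
# low-rank update A @ B, so only r * (d_in + d_out) extra parameters train.
# Conceptual only; not ErnieKit's actual interface.
rng = np.random.default_rng(0)

d_in, d_out, r = 128, 128, 8
W = rng.standard_normal((d_in, d_out)) * 0.02    # frozen pretrained weight
A = rng.standard_normal((d_in, r)) * 0.02        # trainable low-rank factor
B = np.zeros((r, d_out))                         # starts at zero: no initial change

def lora_forward(x, alpha=16):
    """Base path plus scaled low-rank adapter path."""
    return x @ W + (alpha / r) * (x @ A @ B)

x = rng.standard_normal((2, d_in))
print(lora_forward(x).shape)                                  # (2, 128)
print(f"adapter params: {A.size + B.size} vs full weight: {W.size}")
```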

A key feature of ErnieKit is that it supports multi-hardware configurations, making it suitable for everything from low-end to high-end server clusters.

Benefits for Global Developers and Researchers

This step by Baidu will prove to be extremely useful, particularly for startups, academics, researchers, and independent developers. Under the Apache 2.0 license, they get:

  • Complete customization freedom
  • Facility for experimentation in academic projects
  • Freedom to develop commercial applications without restrictive license terms
