Baidu Releases Ernie 4.5 Series AI Models in Open Source, Offers Multi-Hardware Toolkits

Baidu just dropped a bombshell for the open-source AI community: the Ernie 4.5 series is here, and it’s completely free to use. Making good on its earlier promise, the Chinese tech titan unleashed not one but ten different flavors of these cutting-edge AI models, built on the Mixture-of-Experts (MoE) architecture. But that’s not all: Baidu is also throwing in multi-hardware development toolkits to empower creators to push Ernie 4.5 to its absolute limits. The AI revolution just got a serious upgrade.

Baidu Releases 10 Variants of Ernie 4.5 AI Models in Open Source

Baidu just unleashed a horde of open-source Ernie 4.5 AI models onto the world, announced on X and published to Hugging Face and GitHub. This isn’t just a release; it’s an AI playground. Dig into ten powerful models, including multimodal vision-language beasts, efficiency-optimized MoE powerhouses, and reasoning-focused models. Five of them are post-trained and ready to use; the other five are pre-trained base models. Download, experiment, and build the future.

Baidu’s cooking up some serious AI heat! Its new Mixture-of-Experts (MoE) models activate only a small slice of their parameters at any given moment: the 300-billion-parameter flagship text model fires up 47 billion per token, while the lean 21-billion-parameter variant activates just 3 billion. Think of it like a hyper-specialized AI dream team where only the right experts show up for each job. And if that’s not impressive enough, the biggest beast in the 10-model lineup, a vision-language model, flexes a staggering 424 billion total parameters, all forged in Baidu’s PaddlePaddle deep learning framework.
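The sparse-activation idea behind MoE can be sketched in a few lines: a router scores all the experts for each token, but only the top-k actually run, so compute scales with active parameters rather than total parameters. This toy uses made-up sizes and random weights purely for illustration; it is not Ernie 4.5’s actual router.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only -- real MoE models have far more and far
# larger experts, of which just a few fire per token.
N_EXPERTS, TOP_K, D = 8, 2, 16

router_w = rng.normal(size=(D, N_EXPERTS))            # routing weights
experts = [rng.normal(size=(D, D)) for _ in range(N_EXPERTS)]

def moe_forward(x):
    """Route token vector x to its top-k experts and mix their outputs."""
    logits = x @ router_w
    top = np.argsort(logits)[-TOP_K:]                 # indices of chosen experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top)), top

x = rng.normal(size=D)
y, chosen = moe_forward(x)
print(f"token routed to experts {sorted(chosen.tolist())}; output dim {y.shape[0]}")
```

Only 2 of the 8 expert matrices touch each token here, which is the same ratio game that lets a 300B-parameter model run with 47B active parameters.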

Ernie 4.5 is making waves. Internal tests reveal the Ernie-4.5-300B-A47B-Base model dominates DeepSeek-V3-671B-A37B-Base, conquering 22 out of 28 benchmarks. But the real kicker? The remarkably efficient Ernie-4.5-21B-A3B-Base, sporting 30% fewer parameters, outmaneuvers Qwen3-30B-A3B-Base in the critical arenas of mathematics and reasoning. The future of AI is powerful, and apparently, smaller.

Baidu also pulled back the curtain on how Ernie 4.5 was trained, showcasing a mastery of AI wizardry. The models are built on a “heterogeneous MoE structure,” and to train them at scale Baidu deployed a suite of techniques: “intra-node expert parallelism” to spread experts across accelerators like a conductor leading a digital orchestra, “memory-efficient pipeline scheduling” to squeeze every ounce of performance from the hardware, “FP8 mixed-precision training” for faster, cheaper arithmetic, and a “fine-grained recomputation method” that trades a little extra compute for big memory savings.
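The mixed-precision trick deserves a quick illustration of why it needs care: in a narrow float format, tiny gradient updates can round away entirely, which is why mixed-precision recipes keep a full-precision “master” copy of the weights. NumPy has no FP8 type, so float16 stands in for the low-precision format below; the numbers are illustrative only, not Ernie’s actual recipe.

```python
import numpy as np

w16 = np.float16(1.0)            # weight kept only in low precision
w32 = np.float32(1.0)            # full-precision master weight
tiny_update = np.float16(1e-4)   # a small gradient step

for _ in range(100):
    # In float16 the gap between representable values near 1.0 is ~0.001,
    # so adding 1e-4 rounds straight back to 1.0 every single step.
    w16 = np.float16(w16 + tiny_update)
    # The float32 master weight accumulates the updates faithfully.
    w32 = np.float32(w32 + np.float32(tiny_update))

print(f"low-precision weight: {w16}, master weight: {w32:.5f}")
```

After 100 steps the low-precision-only weight hasn’t moved at all, while the master weight has absorbed the full update; doing matmuls in the cheap format but accumulating in the wide one captures the essence of mixed-precision training.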

Baidu isn’t just showcasing new models; it’s handing over the keys to the kingdom. ErnieKit, a comprehensive development toolkit for the Ernie 4.5 series, is now open to the community, supporting pre-training, supervised fine-tuning (SFT), and Low-Rank Adaptation (LoRA), all with the freedom to customize. Best part? The models and tools are licensed under Apache 2.0, opening the door for both researchers and entrepreneurs to harness Ernie’s potential.
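The LoRA mode mentioned above follows the general low-rank adaptation recipe, which can be sketched independently of the toolkit: freeze the big weight matrix and learn only a small low-rank correction on top of it. Everything below (shapes, rank, initialization) is illustrative and is not ErnieKit’s actual API or defaults.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, rank = 128, 128, 4   # illustrative sizes, far smaller than Ernie's

W = rng.normal(size=(d_out, d_in))        # frozen pretrained weight (not trained)
A = rng.normal(size=(rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))               # trainable up-projection, zero-initialized
                                          # so the adapter starts as a no-op

def lora_forward(x):
    """Frozen path W @ x plus a rank-`rank` trainable correction B @ A @ x."""
    return W @ x + B @ (A @ x)

full_params = W.size
lora_params = A.size + B.size
print(f"trainable params: {lora_params} vs full fine-tune {full_params} "
      f"({100 * lora_params / full_params:.1f}%)")
```

Even in this toy setting the adapter trains about 6% of the parameters a full fine-tune would touch, which is why LoRA makes adapting very large models practical on modest hardware.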
