Baidu: ERNIE 4.5 21B A3B

A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It builds on the ERNIE 4.5 family's heterogeneous MoE architecture, which uses modality-isolated routing plus specialized routing and balancing losses. The model supports a 131K-token context length and achieves efficient inference through multi-expert parallel collaboration and quantization. Post-training combines SFT, DPO, and UPO to optimize performance across diverse applications.

Input Price

$0.07 per 1M tokens

Output Price

$0.28 per 1M tokens
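Given the per-million-token prices above, the cost of a single request is simple arithmetic. A minimal sketch (the token counts in the example are hypothetical):

```python
# Listed prices for this model, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.07
OUTPUT_PRICE_PER_M = 0.28

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# e.g. a 10K-token prompt with a 2K-token reply:
print(f"${estimate_cost(10_000, 2_000):.5f}")  # → $0.00126
```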

Context

131K tokens

Parameters

21B total (3B activated per token)

Features
Function Calling
Model ID: baidu/ernie-4.5-21b-a3b
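Since the model supports function calling, a request can include tool definitions the model may choose to invoke. A minimal sketch of such a request body, assuming an OpenAI-compatible chat-completions format (the `get_weather` tool, its schema, and the endpoint conventions are hypothetical; check your provider's documentation):

```python
import json

MODEL_ID = "baidu/ernie-4.5-21b-a3b"

# Hypothetical tool definition in the common JSON-schema tool format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Request payload: the model decides whether to call the tool.
payload = {
    "model": MODEL_ID,
    "messages": [{"role": "user", "content": "What's the weather in Beijing?"}],
    "tools": tools,
    "tool_choice": "auto",
}

print(json.dumps(payload, indent=2))
```

When the model elects to call a tool, the response carries the function name and JSON arguments instead of plain text; the client executes the function and sends the result back in a follow-up message.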