Baidu: ERNIE 4.5 21B A3B
A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It inherits the ERNIE 4.5 family's heterogeneous MoE architecture, including modality-isolated routing and specialized routing and balancing losses, and supports a 131K-token context window. Efficient inference is achieved through multi-expert parallel collaboration and quantization, while post-training techniques including SFT, DPO, and UPO tune the model for a broad range of tasks.
21B total / 3B active
Parameters
131K tokens
Context Window
Proprietary
License
Aug 12, 2025
Released
💰 Pricing
Input
$0.07
per 1M tokens
Output
$0.28
per 1M tokens
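From the listed rates ($0.07 per 1M input tokens, $0.28 per 1M output tokens), the cost of a request is simple arithmetic. A minimal sketch (the function name and example token counts are illustrative, not from the listing):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.07, output_rate: float = 0.28) -> float:
    """Estimate request cost in USD from the listed per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# Example: a 100K-token prompt with a 2K-token completion
cost = estimate_cost(100_000, 2_000)
print(f"${cost:.4f}")  # → $0.0076
```

At these rates, even near-full-context prompts stay under a cent per request.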
API Available
This model is accessible via API for integration into your applications.
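Many hosted model APIs expose an OpenAI-compatible chat-completions interface; assuming this provider does too (the base URL, endpoint path, and model id below are placeholders, not values confirmed by this listing), building a request could be sketched as:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, prompt: str,
                       model: str = "ernie-4.5-21b-a3b") -> urllib.request.Request:
    """Build a chat-completions HTTP request. The /chat/completions path
    assumes an OpenAI-compatible API, which is an assumption here."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# The request would then be sent with urllib.request.urlopen(req).
req = build_chat_request("https://api.example.com/v1", "YOUR_API_KEY", "Hello")
```

Consult the provider's API documentation for the actual endpoint and authentication scheme before integrating.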
⭐ Related Models
Claude 3.5 Sonnet
Anthropic
The model that defined a generation. Fast, smart, and incredibly capable across coding, analysis, and creative tasks.
Claude 3.5 Haiku
Anthropic
Ultra-fast and cost-effective. Best for high-volume tasks where speed matters more than peak intelligence.
GPT-4o Mini
OpenAI
Compact and affordable. Surprisingly capable for its price point, ideal for high-volume applications.