โ† Back to all models
๐Ÿ’ฌ

Baidu: ERNIE 4.5 21B A3B

baidu · Text Generation

A text-only Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It builds on the ERNIE 4.5 family's heterogeneous MoE architecture, which uses modality-isolated routing together with specialized routing and balancing losses; this 21B A3B variant handles text only. The model supports a 131K-token context window and achieves efficient inference through multi-expert parallel collaboration and quantization. Post-training combines SFT, DPO, and UPO to optimize performance across diverse applications.

#text->text #top-provider
🧮

21B (3B active)

Parameters

📏

131K tokens

Context Window

🔒

Apache 2.0

License

๐Ÿ“…

Aug 12, 2025

Released

💰 Pricing

Input

$0.07

per 1M tokens

Output

$0.28

per 1M tokens
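At these rates, per-request cost is simple arithmetic. The helper below applies the listed prices; the 10,000-input / 2,000-output token counts are made-up illustration values.

```python
INPUT_PER_M = 0.07   # USD per 1M input tokens (from the pricing above)
OUTPUT_PER_M = 0.28  # USD per 1M output tokens

def cost_usd(input_tokens, output_tokens):
    """Estimated cost of one request at the listed per-million-token rates."""
    return (input_tokens / 1e6) * INPUT_PER_M + (output_tokens / 1e6) * OUTPUT_PER_M

# Example: a 10,000-token prompt with a 2,000-token reply
print(f"${cost_usd(10_000, 2_000):.5f}")  # $0.00126
```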

🔌

API Available

This model is accessible via API for integration into your applications.
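As a sketch only: many hosted model APIs accept OpenAI-style chat-completion payloads, so a request might be assembled as below. The endpoint URL, model slug, and parameter values are assumptions for illustration, not documented values for this provider; consult the actual API reference before integrating.

```python
import json
import urllib.request

def build_request(prompt, api_key, model="baidu/ernie-4.5-21b-a3b"):
    """Assemble a hypothetical OpenAI-style chat request (URL and slug assumed)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
    }
    return urllib.request.Request(
        "https://api.example.com/v1/chat/completions",  # placeholder endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Summarize MoE routing in one sentence.", api_key="sk-...")
# urllib.request.urlopen(req) would send it; omitted here.
```

Building the `Request` object separately from sending it keeps the example runnable offline and makes the payload easy to inspect or test.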