ERNIE 4.5 300B

ERNIE-4.5-300B-A47B is a 300B-parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool calling, and context lengths up to 131K tokens. It is suitable for general-purpose LLM applications with high reasoning and throughput demands.

Providers 1
Released Jun 30, 2025
Input Modalities text
Output Modalities text
Task Use coding

Available Providers (1)

Provider Model ID Input Cost Output Cost Context Max Output Docs
NanoGPT baidu/ernie-4.5-300b-a47b $0.35/MTok $1.15/MTok 131.1K 16.4K
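As a quick illustration of the per-million-token rates in the table above, a small sketch of how request cost would be estimated. The token counts in the example are hypothetical; only the two rates come from the NanoGPT row.

```python
# Per-million-token rates from the NanoGPT row above.
INPUT_RATE = 0.35   # USD per million input tokens
OUTPUT_RATE = 1.15  # USD per million output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# Example: a 10,000-token prompt with a 1,000-token completion.
cost = estimate_cost(10_000, 1_000)
print(f"${cost:.5f}")  # → $0.00465
```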

Capabilities

Reasoning
Tool Calling
Attachments
Open Weights
Structured Output
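Since the model advertises tool calling, a minimal sketch of a chat request payload follows. It assumes the provider exposes an OpenAI-compatible chat-completions schema (not confirmed by this page); the `get_weather` tool is a hypothetical example, and only the model ID and the 16.4K max-output limit come from the provider table above.

```python
import json

# Hypothetical request body for an OpenAI-compatible chat endpoint.
# Assumption: the provider accepts this schema; the tool is illustrative.
payload = {
    "model": "baidu/ernie-4.5-300b-a47b",  # model ID from the table above
    "messages": [
        {"role": "user", "content": "What's the weather in Beijing?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "max_tokens": 1024,  # must stay within the 16.4K max-output limit
}

print(json.dumps(payload, indent=2))
```

The model is expected to respond with either a normal assistant message or a tool-call request naming `get_weather`, which the caller would then execute and feed back.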