Mixtral 8x22B
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks on the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe
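The providers below expose this model through OpenAI-compatible chat completions endpoints, so a minimal request looks like the sketch that follows. The base URL and the `PROVIDER_API_KEY` environment variable are placeholders; substitute your provider's actual endpoint and key, and use the model ID from that provider's row in the table below.

```python
# Minimal sketch: calling Mixtral 8x22B Instruct via an OpenAI-compatible
# endpoint. base_url and the API key variable are placeholders, not a
# specific provider's real values.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder: your provider's endpoint
    api_key=os.environ["PROVIDER_API_KEY"],  # placeholder environment variable
)

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b-instruct-v0.1",  # model ID from the table below
    messages=[
        # A French prompt, playing to the model's multilingual fluency.
        {"role": "user", "content": "Expliquez le mélange d'experts en une phrase."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```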
Available Providers (3)
| Provider | Model ID | Input Cost | Output Cost | Context | Max Output | Docs |
|---|---|---|---|---|---|---|
| | mistralai/mixtral-8x22b-instruct-v0.1 | $0.90/MTok | $0.90/MTok | 65.5K | 32.8K | |
| | open-mixtral-8x22b | $2.00/MTok | $6.00/MTok | 64K | 64K | |
| | mistral/mixtral-8x22b-instruct | $2.00/MTok | $6.00/MTok | 64K | 64K | |
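Prices are quoted per million tokens (MTok), billed separately for input and output. A quick sketch of the arithmetic, using the $2.00/$6.00 per-MTok rates from the table (illustrative only; check your provider's current pricing):

```python
# Rough per-request cost estimate at $2.00/MTok input, $6.00/MTok output.
INPUT_PER_MTOK = 2.00   # USD per 1M input tokens
OUTPUT_PER_MTOK = 6.00  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens * INPUT_PER_MTOK + output_tokens * OUTPUT_PER_MTOK) / 1_000_000

# Example: a 10k-token prompt producing a 1k-token completion.
print(f"${request_cost(10_000, 1_000):.4f}")  # $0.0260
```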
Capabilities

- Reasoning
- Tool Calling
- Attachments
- Open Weights
- Structured Output
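Since the capability list includes tool calling, here is a sketch of what a tool-calling request looks like through the same OpenAI-compatible interface. The `get_weather` tool, the endpoint, and the key variable are all hypothetical, included only to show the request shape:

```python
# Sketch: tool calling against an OpenAI-compatible endpoint. The tool
# definition below is hypothetical, for illustration only.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder endpoint, as above
    api_key=os.environ["PROVIDER_API_KEY"],
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b-instruct-v0.1",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to invoke the tool, the call arrives as JSON arguments.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```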