Mistral: Mistral Nemo
A 12B parameter model with a 128k token context length built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
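Since the model supports function calling, a request to it can include tool definitions in the JSON-schema style that OpenAI-compatible endpoints commonly accept. The sketch below builds such a request body; the `get_weather` tool, its parameters, and the endpoint conventions are illustrative assumptions, not part of this listing.

```python
import json

# Hypothetical tool schema in the OpenAI-compatible format many providers
# accept; the tool name and parameters are illustrative assumptions.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Request body using the model ID from the provider table below.
request_body = {
    "model": "mistralai/mistral-nemo",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}

print(json.dumps(request_body, indent=2))
```

The exact request shape depends on the provider's API; consult their docs before relying on this layout.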
Available Providers (1)
| Provider | Model ID | Input Cost | Output Cost | Context | Max Output | Docs |
|---|---|---|---|---|---|---|
| Mistral | mistralai/mistral-nemo | $0.02/MTok | $0.04/MTok | 131.1K | 16.4K | |
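The per-million-token rates in the table translate into request cost as a simple weighted sum. A minimal sketch, using the listed rates of $0.02/MTok input and $0.04/MTok output (the helper name is illustrative):

```python
# Rates from the provider table, in USD per million tokens.
INPUT_RATE_PER_MTOK = 0.02
OUTPUT_RATE_PER_MTOK = 0.04

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost for one request at the listed rates."""
    return (
        input_tokens / 1_000_000 * INPUT_RATE_PER_MTOK
        + output_tokens / 1_000_000 * OUTPUT_RATE_PER_MTOK
    )

# Example: 1M input tokens plus 500K output tokens.
print(f"${estimate_cost(1_000_000, 500_000):.4f}")  # → $0.0400
```

Note that a full 131.1K-token context filled entirely with input would cost roughly $0.0026 per request at these rates.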
Capabilities
- Reasoning
- Tool Calling
- Attachments
- Open Weights
- Structured Output