Meta: Llama 4 Scout

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model from Meta that activates 17 billion parameters out of a 109B total. It accepts native multimodal input (text and image) and produces text and code output across 12 supported languages. Designed for assistant-style interaction and visual reasoning, Scout routes each token through one of 16 experts and offers a context length of 10 million tokens; its training corpus spans roughly 40 trillion tokens. Built for efficient local or commercial deployment, Llama 4 Scout uses early fusion to integrate modalities and is instruction-tuned for multilingual chat, captioning, and image-understanding tasks. It is released under the Llama 4 Community License, with a training-data cutoff of August 2024 and a public launch on April 5, 2025.

Providers: 1
Released: Apr 5, 2025
Input Modalities: text, image
Output Modalities: text
Task: coding

Available Providers (1)

Provider: Kilo Gateway
Model ID: meta-llama/llama-4-scout
Input Cost: $0.08/MTok
Output Cost: $0.30/MTok
Context: 327.7K
Max Output: 16.4K
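The listed rates ($0.08 per million input tokens, $0.30 per million output tokens) make per-request cost easy to estimate. A minimal sketch, assuming only the prices above (the function name and the example token counts are illustrative, not from the listing):

```python
# Rough per-request cost estimate for Llama 4 Scout at the listed rates:
# $0.08 per million input tokens, $0.30 per million output tokens.
INPUT_RATE = 0.08 / 1_000_000   # USD per input token
OUTPUT_RATE = 0.30 / 1_000_000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a 20,000-token prompt with a 1,000-token completion.
print(f"${request_cost(20_000, 1_000):.6f}")  # → $0.001900
```

At these prices even a prompt that fills a large fraction of the provider's 327.7K-token context window costs only a few cents.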

Capabilities

Reasoning
Tool Calling
Attachments
Open Weights
Structured Output
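Since the model is served through a gateway and advertises tool calling, requests would typically follow the OpenAI-compatible chat-completions schema. A minimal sketch of such a payload, where the endpoint, auth, and the `get_weather` tool are all hypothetical assumptions rather than details from this listing:

```python
import json

# Hypothetical OpenAI-compatible chat payload exercising the tool-calling
# capability; the gateway URL and API key are deployment-specific and omitted.
payload = {
    "model": "meta-llama/llama-4-scout",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # illustrative tool, not part of the listing
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))
```

If the model chooses to call the tool, the response would carry a `tool_calls` entry naming the function and its JSON arguments, which the caller executes and feeds back as a `tool` message.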