
Llama 4 Scout


Open Source · Paid

About

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters out of a total of 109B. It supports native multimodal input (text and image) and multilingual text and code output across 12 supported languages. Designed for assistant-style interaction and visual reasoning, Scout routes each token to one of 16 routed experts (alongside a shared expert) per forward pass and supports a native context length of up to 10 million tokens (the deployment listed under Details exposes 327.7K), with a training corpus of ~40 trillion tokens. It is built for high efficiency.
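
The routing pattern described above can be sketched in a few lines. Below is a minimal, illustrative top-1 MoE layer in PyTorch; it is not Meta's implementation, and the layer name and dimensions are toy values. It shows why only ~17B of the 109B total parameters are active per token: each token runs through the shared expert and exactly one routed expert.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Top1MoELayer(nn.Module):
    """Illustrative top-1 MoE routing with a shared expert (toy sizes)."""

    def __init__(self, d_model=64, d_ff=256, n_experts=16):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)  # one score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        # The shared expert processes every token, routed experts only their share.
        self.shared = nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))

    def forward(self, x):  # x: (tokens, d_model)
        gate = F.softmax(self.router(x), dim=-1)  # (tokens, n_experts)
        weight, idx = gate.max(dim=-1)            # top-1 expert per token
        out = self.shared(x)                      # shared expert sees all tokens
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():                        # only chosen tokens run expert e
                out[mask] = out[mask] + weight[mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = Top1MoELayer()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])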

Details

Modality: image, text
Context Window: 327.7K tokens
Release Date: Apr 5, 2025
API Available: Yes (example call after this list)
Hosting: self-hosted, API
Output Speed: 250 tokens/sec
Time to First Token: 250 ms
Quality Index: 68/100
Coding Index: 62/100
Reasoning Index: 66/100
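
Since the listing marks the model as API-available for both hosted and self-hosted use, here is a minimal sketch of a call. It assumes an OpenAI-compatible endpoint (as exposed by common self-hosting servers such as vLLM); the base URL, API key handling, and image URL are placeholders, and the model ID is the Hugging Face repository name.

import os
from openai import OpenAI

# Hypothetical endpoint; replace with your provider's URL or a local server.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key=os.environ.get("API_KEY", "not-needed-for-local"),
)

# Text + image input, matching the model's multimodal modality.
response = client.chat.completions.create(
    model="meta-llama/Llama-4-Scout-17B-16E-Instruct",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in one sentence."},
            {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)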

Benchmarks

Chatbot Arena ELO: 1340
MMLU-Pro: 74.8%
SWE-bench Verified: 35%
MATH-500: 75%
GPQA Diamond: 55%
HumanEval: 87.5%

Tags

multimodal, open-weight, api, chat