DeepYard

Llama 4 Maverick

Open Source · Paid

About

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass (400B total). It accepts multilingual text and image input and produces multilingual text and code output across 12 supported languages. Optimized for vision-language tasks, Maverick is instruction-tuned for assistant-like behavior, image reasoning, and general-purpose multimodal interaction.
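
The "17 billion active parameters (400B total)" figure follows from MoE routing: a learned gate selects a small subset of the 128 experts for each token, so only that subset's weights participate in the forward pass. The NumPy sketch below is a generic top-1-gating illustration of that idea; the layer sizes, gating scheme, and activation are toy placeholders, not Meta's actual Llama 4 implementation.

import numpy as np

# Toy mixture-of-experts layer with top-1 gating (NumPy only).
# Generic illustration of "active vs. total" parameters; sizes, gating
# scheme, and activation are placeholders, not Meta's implementation.
rng = np.random.default_rng(0)
d_model, d_ff, n_experts = 64, 256, 128                       # 128 experts, as in the listing
w1 = rng.standard_normal((n_experts, d_model, d_ff)) * 0.02   # expert up-projections
w2 = rng.standard_normal((n_experts, d_ff, d_model)) * 0.02   # expert down-projections
router = rng.standard_normal((d_model, n_experts)) * 0.02     # learned gate

def moe_layer(x):
    """Send each token through only its single highest-scoring expert."""
    scores = x @ router                            # (tokens, n_experts)
    gate = np.exp(scores - scores.max(-1, keepdims=True))
    gate /= gate.sum(-1, keepdims=True)            # softmax gate weights
    chosen = scores.argmax(axis=-1)                # one expert id per token
    out = np.zeros_like(x)
    for i, e in enumerate(chosen):
        h = np.maximum(x[i] @ w1[e], 0.0)          # expert FFN with ReLU
        out[i] = gate[i, e] * (h @ w2[e])          # only expert e's weights are used
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)                     # (4, 64); each token touched 1 of 128 experts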

Details

Modality: image, text
Context Window: 1.0M tokens
Release Date: Apr 5, 2025
API Available: Yes
Hosting: self-hosted, API
Output Speed: 180 tokens/sec
Time to First Token: 350 ms
Quality Index: 76/100
Coding Index: 70/100
Reasoning Index: 74/100
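
Since the listing marks the model as API-available, a request might look like the sketch below, which assumes an OpenAI-compatible chat completions endpoint; the base URL, API key, and model identifier are placeholders, not values taken from this listing. The trailing comment turns the listed throughput figures into a rough end-to-end latency estimate.

from openai import OpenAI

# Minimal sketch of a multimodal (image + text) request, assuming an
# OpenAI-compatible chat completions endpoint. The base_url, api_key, and
# model identifier are placeholders, not values taken from this listing.
client = OpenAI(
    base_url="https://your-provider.example/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                       # placeholder credential
)

response = client.chat.completions.create(
    model="meta-llama/Llama-4-Maverick-17B-128E-Instruct",  # assumed model id
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in Spanish."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
    max_tokens=256,
)
print(response.choices[0].message.content)

# Rough latency expectation from the listed figures:
# ~0.35 s to first token, then ~256 / 180 ≈ 1.4 s of generation.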

Benchmarks

Chatbot Arena ELO: 1350/1500
MMLU-Pro: 80.5/100
MATH-500: 82/100
GPQA Diamond: 62.5/100
HumanEval: 90.8/100

Tags

multimodal, long-context, open-weight, api, chat