Llama 4 Maverick
Open Source • Meta • Released on 2025-04-05
Meta's flagship open-source multimodal model: 17B active parameters out of 400B total (128-expert MoE). 1M-token context window, natively multimodal with early fusion. Extremely cost-effective at $0.15/$0.60 per million tokens. Supports 12 languages.
80
Overall Score
Core Specs
1049K
Context Window
16K
Max Output
✗
Reasoning
✓
Open Source
Multimodal Support
text, image
Scenario Scores
User Feedback Highlights
Based on community feedback.
Sentiment: 👍 65% · 😐 20% · 👎 15%
Pros & Cons
Pros
- Extremely affordable ($0.15/$0.60 per 1M tokens)
- 1M context window
- Native multimodal (text + image)
- Open source (Llama 4 Community License)
- High-throughput MoE architecture
Cons
- Coding performance below Claude/GPT
- Benchmark gaming controversy
- 16K max output limit
- Knowledge cutoff August 2024
Reliability
Incidents (30d): 0
Open-weight model; reliability depends on the hosting provider.
Pricing
Input (per 1M tokens): $0.15
Output (per 1M tokens): $0.60
Free trial available
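The per-million-token rates above translate directly into per-request cost. A minimal sketch of that arithmetic, assuming hypothetical token counts (the rates are the ones listed on this page):

```python
# Listed rates for Llama 4 Maverick (USD per 1M tokens).
INPUT_RATE = 0.15
OUTPUT_RATE = 0.60


def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    return (input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE) / 1_000_000


# Example: a 50K-token prompt with a 2K-token reply (hypothetical sizes)
cost = request_cost(50_000, 2_000)
print(f"${cost:.4f}")  # 0.0075 input + 0.0012 output = $0.0087
```

Note that output tokens cost 4x input tokens, so long generations dominate the bill even though the 16K output cap bounds the worst case.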
Updated on 2026-03-07
Benchmarks
MMMU: 73.4%
Coding accuracy: 70%
LMArena Elo: 1417