Ideal for: Long-Context Workloads
Best LLM for Long-Context Workloads
Ranked by context window size, needle-in-a-haystack retrieval accuracy, and input-token price, since long-context workloads are dominated by input tokens.
Updated April 2026. Top 3 this month: Qwen3.5 Plus 2026-02-15, Qwen3.5 397B A17B, MiniMax-01.
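To make the ranking criteria concrete, here is a minimal sketch of how a composite long-context score could be computed from those three factors. The model figures, weights, and normalization scheme are illustrative assumptions, not the site's actual data or methodology.

```python
# Hypothetical long-context ranking sketch. All numbers below are
# made up for illustration; they are not real benchmark results.
models = {
    # name: (context window in tokens, NIAH accuracy 0..1, input $ per 1M tokens)
    "model-a": (1_000_000, 0.97, 0.40),
    "model-b": (256_000, 0.99, 1.20),
    "model-c": (4_000_000, 0.92, 0.20),
}

def normalize(values, invert=False):
    """Min-max scale to [0, 1]; invert when lower raw values are better."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(hi - v) / span if invert else (v - lo) / span for v in values]

names = list(models)
ctx = normalize([models[n][0] for n in names])
acc = normalize([models[n][1] for n in names])
price = normalize([models[n][2] for n in names], invert=True)  # cheaper is better

# Weight input price substantially because long-context is input-token-heavy.
weights = (0.3, 0.4, 0.3)  # context, accuracy, price (assumed weights)
scores = {n: weights[0] * c + weights[1] * a + weights[2] * p
          for n, c, a, p in zip(names, ctx, acc, price)}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```

With these illustrative numbers, the huge-context, cheap model edges out the most accurate one, showing how the weighting trades off the three factors.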