Mistral: Mixtral 8x22B Instruct

Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe

Input Price

$2.00/1M

Output Price

$6.00/1M

Context

65.5K tokens
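The listed prices and context window translate directly into a per-request cost. A minimal sketch of that arithmetic, using the $2.00/1M input and $6.00/1M output rates and the 65.5K-token context limit from this listing (whether the limit counts input and output tokens together is an assumption):

```python
# Prices and context limit taken from the listing above.
INPUT_PRICE_PER_M = 2.00   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 6.00  # USD per 1M output tokens
CONTEXT_LIMIT = 65_500     # 65.5K-token context window

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed rates."""
    # Assumption: input + output must fit in the context window together.
    if input_tokens + output_tokens > CONTEXT_LIMIT:
        raise ValueError("request exceeds the 65.5K-token context window")
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 10,000-token prompt with a 1,000-token completion:
# 10_000 * 2.00/1e6 + 1_000 * 6.00/1e6 = 0.020 + 0.006 = 0.026 USD
print(request_cost(10_000, 1_000))
```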

Parameters

176B

Features
Function Calling
Model ID: mistralai/mixtral-8x22b-instruct
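Since the model supports function calling, here is a minimal sketch of a tool-calling request body in the widely used OpenAI-compatible chat-completions format. The format and the `get_weather` tool are assumptions for illustration; only the model ID comes from this listing:

```python
import json

# Hedged sketch of a chat-completions request with one tool definition.
# The OpenAI-compatible schema is an assumption; this listing only states
# that the model supports function calling.
payload = {
    "model": "mistralai/mixtral-8x22b-instruct",  # Model ID from the listing
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))
```

A provider that follows this convention would respond with a `tool_calls` entry naming the function and its JSON arguments, which the caller executes and feeds back as a `tool` message.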