Mistral: Mixtral 8x7B Instruct

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model from Mistral AI, fine-tuned for chat and instruction following. Each layer contains 8 experts (feed-forward networks), of which a router activates 2 per token, for a total of roughly 47 billion parameters with about 13 billion active per token. #moe

Input Price

$0.54 / 1M tokens

Output Price

$0.54 / 1M tokens
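Since input and output tokens are billed at the same per-million rate, estimating the cost of a request is simple arithmetic. A minimal sketch (the token counts in the example are illustrative, not from the listing):

```python
# Both prices from the listing: $0.54 per 1M tokens.
INPUT_PRICE_PER_TOKEN = 0.54 / 1_000_000
OUTPUT_PRICE_PER_TOKEN = 0.54 / 1_000_000

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_TOKEN
            + output_tokens * OUTPUT_PRICE_PER_TOKEN)

# e.g. a 10k-token prompt with a 2k-token completion:
print(f"${request_cost(10_000, 2_000):.5f}")
```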

Context

32.8K tokens

Parameters

46.7B total (~12.9B active per token)

Features
Function Calling
Model ID: mistralai/mixtral-8x7b-instruct
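Since the listing advertises function calling, a request to this model can carry an OpenAI-style `tools` array alongside the model ID. A minimal sketch that builds such a request body; the endpoint is assumed to be OpenAI-compatible, and the `get_weather` tool name and its schema are hypothetical, not from the listing:

```python
MODEL_ID = "mistralai/mixtral-8x7b-instruct"  # from the listing above

# Hypothetical tool definition in the OpenAI function-calling schema.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def build_chat_request(prompt: str, use_tools: bool = False) -> dict:
    """Build a chat-completion request body for an OpenAI-compatible API."""
    body = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    if use_tools:
        body["tools"] = [WEATHER_TOOL]
    return body

req = build_chat_request("What's the weather in Paris?", use_tools=True)
print(req["model"])
```

The resulting dictionary is what you would POST (as JSON) to the provider's chat-completions endpoint; the model may respond with a `tool_calls` entry instead of plain text when it decides to invoke the tool.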