Mistral AI / Mixtral 8x22B
Released: 4/17/2024
Input: $2.00 / Output: $6.00 (per 1M tokens)
Mixtral-8x22B is an LLM with strong multilingual support across English, French, Italian, German, and Spanish, and strong performance on mathematics and coding tasks. It uses a Sparse Mixture of Experts architecture, activating only 39 billion of its 141 billion parameters per token for efficient inference.
Noteworthy use cases for Mixtral-8x22B include handling multilingual tasks and performing complex mathematical computations.
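The Sparse Mixture of Experts design means each token is routed to only 2 of the model's 8 expert feed-forward blocks, which is how roughly 39 of the 141 billion parameters end up active per token. A toy sketch of top-2 routing, with illustrative dimensions and not the actual model code:

```python
import numpy as np

def top2_moe_layer(x, experts, gate_w):
    """Route one token vector through a sparse Mixture of Experts layer.

    Only the 2 highest-scoring experts run per token; the rest are skipped.
    This is why Mixtral-8x22B activates ~39B of its 141B parameters per token.
    Dimensions here are toy values, not the real model's.
    """
    scores = x @ gate_w                 # gate logits, one per expert
    top2 = np.argsort(scores)[-2:]      # indices of the 2 best experts
    weights = np.exp(scores[top2])
    weights /= weights.sum()            # softmax over the selected pair
    # Weighted combination of only the two selected experts' outputs.
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

# Toy setup: 8 experts (as in Mixtral), each a simple linear map.
rng = np.random.default_rng(0)
d = 16
expert_mats = [rng.normal(size=(d, d)) for _ in range(8)]
experts = [lambda v, W=W: v @ W for W in expert_mats]
gate_w = rng.normal(size=(d, 8))

out = top2_moe_layer(rng.normal(size=d), experts, gate_w)
print(out.shape)  # (16,)
```

Because only the selected experts run, per-token compute scales with the 39 billion active parameters rather than the full 141 billion, although memory must still hold all of the experts.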
| Metric | Value |
|---|---|
| Parameter Count | 141 billion |
| Mixture of Experts | Yes |
| Active Parameter Count | 39 billion |
| Context Length | 64,000 tokens |
| Multilingual | Yes |
| Quantized* | Unknown |

*Quantization is specific to the inference provider and the model may be offered with different quantization levels by other providers.
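For reference, the model can be queried through an OpenAI-style chat-completions endpoint. The sketch below uses Mistral's public API and assumes the model identifier `open-mixtral-8x22b`; hosted providers, including those in the table below, expose the model under their own endpoints and identifiers:

```python
import os
import requests

# Endpoint and model id follow Mistral's public API; other providers
# may use different URLs and identifiers for the same model.
resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "open-mixtral-8x22b",  # assumed id for Mixtral 8x22B
        "messages": [
            {"role": "user",
             "content": "Summarize in French: MoE models activate only a few experts per token."}
        ],
        "max_tokens": 128,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```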
Mistral AI models available on Oxen.ai
| Model | Inference provider | Input modality | Output modality | Input price (per 1M tokens) | Output price (per 1M tokens) |
|---|---|---|---|---|---|
| | | text | text | $0.20 | $0.60 |
| | | text | text | $0.30 | $0.90 |
| | | text | text | $0.04 | $0.04 |
| | | text | text | $0.10 | $0.10 |
| | | text | text | $0.25 | $0.25 |
| | | text | text | $2.00 | $6.00 |
| | | text | text | $0.15 | $0.15 |
| | | text | text | $0.20 | $0.60 |
| | | text | text | $2.00 | $6.00 |
| | | text | text | $0.70 | $0.70 |
| | | text | text | $0.15 | $0.15 |
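A request's cost follows directly from these per-1M-token rates: cost = input_tokens / 1e6 × input price + output_tokens / 1e6 × output price. A small helper using Mixtral-8x22B's listed $2.00 / $6.00 pricing, with illustrative token counts:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float = 2.00, output_price: float = 6.00) -> float:
    """Estimated USD cost of one request; prices are USD per 1M tokens."""
    return input_tokens / 1e6 * input_price + output_tokens / 1e6 * output_price

# Illustrative: a 50k-token prompt (within the 64k context) and a 2k-token reply.
print(f"${request_cost(50_000, 2_000):.4f}")  # $0.1120
```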