
Mistral: Mixtral 8x7B Instruct

Mistral · Text Generation

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI, fine-tuned for chat and instruction use. It incorporates 8 experts (feed-forward networks) per layer for a total of 47 billion parameters, with only a subset of experts active per token. #moe

#text->text #top-provider

🧮

Undisclosed

Parameters

📏

33K tokens

Context Window

🔒

Apache 2.0

License

📅

Dec 10, 2023

Released

💰 Pricing

Input

$0.54

per 1M tokens

Output

$0.54

per 1M tokens
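
At these rates a request's cost is simply total tokens times the per-token price. A minimal sketch of that arithmetic (the token counts below are made-up example values):

```python
PRICE_PER_M = 0.54  # USD per 1M tokens; same rate for input and output

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Total USD cost of one request at the listed per-1M-token rate."""
    return (input_tokens + output_tokens) * PRICE_PER_M / 1_000_000

# Example: 1,000 input tokens plus 500 output tokens.
print(f"${request_cost(1000, 500):.5f}")  # -> $0.00081
```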

🔌

API Available

This model is accessible via API for integration into your applications.
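
A minimal sketch of such an integration, assuming an OpenAI-compatible chat-completions endpoint; the base URL, model slug, and `API_KEY` environment variable are all assumptions to be replaced with the provider's actual values:

```python
import json
import os
import urllib.request

# Hypothetical endpoint and model slug; substitute the provider's real values.
API_URL = "https://api.example.com/v1/chat/completions"
MODEL_SLUG = "mistralai/mixtral-8x7b-instruct"

def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for Mixtral 8x7B Instruct."""
    payload = {
        "model": MODEL_SLUG,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Content-Type": "application/json",
        # Read the key from the environment rather than hard-coding it.
        "Authorization": f"Bearer {os.environ.get('API_KEY', '')}",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )

req = build_request("Explain mixture-of-experts in one sentence.")
print(req.get_full_url())
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns a JSON body whose generated text conventionally sits under `choices[0].message.content` in OpenAI-compatible APIs.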