
Mixtral 8x22B

A high-performance sparse Mixture-of-Experts (SMoE) model from Mistral AI, released under the Apache 2.0 license. It activates only 39B of its 141B total parameters per token, setting a strong performance-to-cost standard among open-weight models.
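For context, here is a minimal sketch of how sparse Mixture-of-Experts routing works in general. This is an illustrative PyTorch example, not Mistral AI's implementation; the class name SparseMoELayer and the hyperparameters (8 experts, top-2 routing, SiLU MLP experts) are assumptions chosen to mirror Mixtral-style models.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative sparse MoE layer: a router picks top-k experts per token."""

    def __init__(self, dim: int, hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score every expert for every token.
        logits = self.gate(x)                            # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)   # keep only top-k experts
        weights = F.softmax(weights, dim=-1)             # renormalize their scores
        out = torch.zeros_like(x)
        # Run each expert only on the tokens routed to it.
        for e, expert in enumerate(self.experts):
            token_pos, slot = (idx == e).nonzero(as_tuple=True)
            if token_pos.numel():
                out[token_pos] += weights[token_pos, slot, None] * expert(x[token_pos])
        return out

# Example: each token passes through just 2 of the 8 expert MLPs.
layer = SparseMoELayer(dim=512, hidden=2048)
y = layer(torch.randn(4, 512))

Because only the top-k experts run per token, an SMoE model's per-token compute stays far below what its total parameter count would suggest, which is the mechanism behind the active-vs-total parameter gap noted above.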

Tags

Mistral AI
Open Source
MoE
Free

Check out the official website for Mixtral 8x22B.

Visit Mixtral 8x22B