
DeepSeek-V2

An open-source Mixture-of-Experts (MoE) language model from DeepSeek with 236B total parameters, of which about 21B are activated per token, known for strong performance and efficient training and inference.
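As background on why an MoE model can have 236B total parameters yet activate only a fraction of them per token: each MoE layer routes a token to a small top-k subset of expert networks, skipping the rest. A minimal top-k gating sketch in Python (illustrative only, not DeepSeek-V2's actual implementation; the experts and gate weights here are toy placeholders):

```python
import math
import random

random.seed(0)

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_layer(token, experts, gate_weights, k=2):
    # Gate scores: one logit per expert (here, a simple dot product).
    logits = [sum(w * x for w, x in zip(gw, token)) for gw in gate_weights]
    probs = softmax(logits)
    # Route to the top-k experts only; the rest are never evaluated,
    # which is why an MoE model activates far fewer parameters per
    # token than its total parameter count.
    topk = sorted(range(len(probs)), key=lambda i: -probs[i])[:k]
    norm = sum(probs[i] for i in topk)
    out = [0.0] * len(token)
    for i in topk:
        weight = probs[i] / norm          # renormalize over chosen experts
        expert_out = experts[i](token)
        out = [o + weight * e for o, e in zip(out, expert_out)]
    return out, topk

# Toy setup: 4 experts, each a fixed scaling of the input (hypothetical).
experts = [lambda x, s=s: [s * v for v in x] for s in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]

out, chosen = moe_layer([0.5, -0.2, 0.1], experts, gate_weights, k=2)
print(f"activated {len(chosen)} of {len(experts)} experts")
```

Real MoE transformers apply this routing per layer with learned gates and load-balancing losses, but the core idea is the same: compute scales with the activated experts, not the total count.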

Tags

DeepSeek
Open Source
MoE
Free

Check out the official website for DeepSeek-V2.

Visit DeepSeek-V2