
GLM-5

Open-source MoE model for bilingual English-Chinese text generation with commercial-friendly MIT license

Open Source

About

GLM-5 is a text generation model built on a mixture-of-experts (MoE) architecture and optimized for both English and Chinese language tasks. With 171K+ downloads and strong community adoption, it offers a commercially viable alternative to proprietary models through its permissive MIT license. Because an MoE model activates only a subset of expert subnetworks per token, the design scales parameter count efficiently while maintaining competitive performance across bilingual applications.
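
Since the weights are open, a minimal sketch of local inference with Hugging Face transformers is shown below. The repo id "zai-org/GLM-5" is an assumption for illustration, not a confirmed location; check the official model card for the actual id and loading requirements.

```python
# Minimal local-inference sketch; the repo id below is hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zai-org/GLM-5"  # assumption: actual repo id may differ

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # MoE checkpoints are large; reduced precision helps
    device_map="auto",           # shard the experts across available GPUs
)

# Bilingual usage: a Chinese prompt ("Briefly introduce MoE models in Chinese.")
prompt = "请用中文简要介绍一下混合专家模型。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:],
                       skip_special_tokens=True))
```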

Details

Modality: text
Release Date: Feb 20, 2026
API Available: Yes
Hosting: API
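
The card lists a hosted API. A hedged sketch of calling it is below, assuming an OpenAI-compatible chat-completions endpoint; the base URL, API key, and model name are placeholders, since the actual endpoint is not documented here.

```python
# Hypothetical API call, assuming an OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder: real endpoint unknown
    api_key="YOUR_API_KEY",
)

resp = client.chat.completions.create(
    model="glm-5",  # assumption: actual model identifier may differ
    messages=[
        {"role": "user",
         "content": "Summarize mixture-of-experts routing in two sentences."},
    ],
)
print(resp.choices[0].message.content)
```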

Tags

open-weight, multimodal, fine-tunable, self-hosted, cost-effective, reasoning