GLM-5
Open-source MoE model for bilingual English-Chinese text generation with commercial-friendly MIT license
Open Source · open-source
About
GLM-5 is a mixture-of-experts (MoE) text generation model optimized for both English and Chinese language tasks. With 171K+ downloads and strong community adoption, it offers a commercially viable alternative to proprietary models through its permissive MIT license. The MoE design enables efficient scaling while maintaining competitive performance across bilingual applications.
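The MoE design mentioned above can be sketched as top-k gating: a router scores all experts per input but runs only the few best-scoring ones, which is what keeps compute low as the parameter count grows. This is a generic illustration; GLM-5's actual expert count, gating function, and k value are not stated on this card.

```python
# Minimal top-k mixture-of-experts (MoE) routing sketch. Illustrative only:
# GLM-5's real expert count, gating, and k are assumptions, not card details.
import math

def top_k_route(gate_scores, k=2):
    # Pick the k highest-scoring experts and softmax-normalize their weights.
    top = sorted(range(len(gate_scores)),
                 key=lambda i: gate_scores[i], reverse=True)[:k]
    exps = [math.exp(gate_scores[i]) for i in top]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(top, exps)]

def moe_forward(x, experts, gate_scores, k=2):
    # Only the selected experts run; the rest are skipped entirely,
    # which is the source of MoE's efficiency at scale.
    return sum(w * experts[i](x) for i, w in top_k_route(gate_scores, k))

# Toy demo: four scalar "experts"; the gate routes the input to experts 1 and 2.
experts = [lambda x, m=m: m * x for m in (1.0, 2.0, 3.0, 4.0)]
y = moe_forward(10.0, experts, gate_scores=[0.1, 0.9, 0.9, 0.1], k=2)
```

In this toy run the two tied top experts get weight 0.5 each, so `y` is 0.5·20 + 0.5·30 = 25.0.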
Details
| Attribute | Value |
| --- | --- |
| Modality | Text |
| Release Date | Feb 20, 2026 |
| API Available | Yes |
| Hosting | API |
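Since the card lists an available API alongside a self-hosted tag, a chat request would typically be sent as JSON. The payload below follows the widely used OpenAI-compatible convention that many self-hosted serving stacks expose; the shape and the `glm-5` model id are assumptions, not details from this card.

```python
# Hypothetical chat request body for an OpenAI-compatible endpoint;
# the "glm-5" model id and message schema are assumptions for illustration.
import json

payload = {
    "model": "glm-5",  # assumed identifier, not confirmed by this card
    "messages": [
        {"role": "system", "content": "You answer in the user's language."},
        {"role": "user", "content": "用一句话介绍一下 MoE 架构。"},  # Chinese input, per the bilingual claim
    ],
    "temperature": 0.7,
}
body = json.dumps(payload, ensure_ascii=False)
```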
Tags
open-weight · multimodal · fine-tunable · self-hosted · cost-effective · reasoning
Quick Info
- Organization: ZAI
- Pricing: Free (self-hosted)
- Free Tier: Yes
- Popularity: 55/100
- Updated: Feb 20, 2026
Also in AI Models
Claude Opus 4.6
Opus 4.6 is Anthropic’s strongest model for coding and long-running professional...
Commercial · paid
Anthropic · $5.00/1M input tokens

Claude Sonnet 4.6
Sonnet 4.6 is Anthropic's most capable Sonnet-class model yet, with frontier per...
Commercial · paid
Anthropic · $3.00/1M input tokens

Gemini 3 Flash
Google DeepMind's latest fast and capable multimodal AI model
Commercial · freemium
Google DeepMind · Free tier available / $0.50/1M input tokens (Flash)