
GLM (family)

Zhipu AI's GLM (General Language Model) family, including the open-source ChatGLM line and the commercial GLM-4, is particularly strong on bilingual Chinese-English work.

GLM (General Language Model) is the LLM family from Zhipu AI, originally developed at Tsinghua University's Knowledge Engineering Group. The line includes ChatGLM-6B (early 2023, one of the earliest widely adopted Chinese open-source LLMs), ChatGLM2 and ChatGLM3, GLM-4 (the commercial frontier model), CogVLM (multimodal vision), and CogView (image generation).

The family matters because ChatGLM-6B was a milestone for the Chinese open-source LLM scene. Released when frontier US models were largely unavailable in China, it gave Chinese developers a usable, locally runnable bilingual chat model, and it was widely fine-tuned and integrated into Chinese products. The commercial GLM-4 has remained a strong contender on Chinese benchmarks such as C-Eval and CMMLU.

A distinguishing feature: GLM models are designed for genuinely bilingual training, rather than English-first training with Chinese added. This shows up in nuance handling; GLM tends to be strong at Chinese idiomatic expression, formal Chinese writing, and the code-switching scenarios common in Chinese professional contexts.

GLM remains closely tied to Tsinghua academia, with regular research paper output. It is available via Zhipu's API, the Qingyan (清言) consumer chat product, and, for older versions, open weights on Hugging Face and ModelScope.

Related: Zhipu AI, ChatGLM, Tsinghua, Chinese AI, Qwen.

Last updated: 2026-04-29
