Stars
ChatGLM2-6B: An Open Bilingual Chat LLM | Open-source bilingual dialogue language model
A curated list of neural network pruning resources.
Companion code for the Efficient-Neural-Network learning series shared on Bilibili
jasonyank / RepDistiller
Forked from HobbitLong/RepDistiller: [ICLR 2020] Contrastive Representation Distillation (CRD), and a benchmark of recent knowledge distillation methods