
Tebmer/Awesome-Knowledge-Distillation-of-LLMs

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

Stars: 1.3k · Forks: 73 · Weekly growth: +0
Source: GitHub
Topics: alignment, compression, data-augmentation, data-synthesis, feedback, instruction-following, kd, knowledge-distillation, large-language-model, llm, multi-modal, self-distillation

[Chart: Star & Fork Trend, 29 data points tracking stars and forks over time]

Multi-Source Signals

Growth Velocity

Tebmer/Awesome-Knowledge-Distillation-of-LLMs gained +0 stars this period. Velocity data will become available once more historical data has been collected.
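For reference, velocity of this kind is usually a first difference over stored snapshots. A minimal sketch follows; the `weekly_velocity` helper and its snapshot format are assumptions for illustration, not this dashboard's actual pipeline.

```python
# Illustrative only: weekly star velocity from periodic (timestamp, stars)
# snapshots. Assumed helper; not the dashboard's real implementation.
from datetime import datetime, timedelta

def weekly_velocity(snapshots: list[tuple[datetime, int]]) -> float | None:
    """Return stars gained per week, or None during a cold start."""
    if len(snapshots) < 2:
        return None  # need at least two points to measure a slope
    snapshots = sorted(snapshots)
    (t0, s0), (t1, s1) = snapshots[0], snapshots[-1]
    span_weeks = (t1 - t0) / timedelta(weeks=1)
    return (s1 - s0) / span_weeks if span_weeks > 0 else None
```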

Deep, signal-backed technical analysis is still being generated for this repository and will be available soon.

| Metric | Awesome-Knowledge-Distillation-of-LLMs | sage | ai-agent-papers | npcpy |
|---|---|---|---|---|
| Stars | 1.3k | 1.3k | 1.3k | 1.3k |
| Forks | 73 | 115 | 94 | 92 |
| Weekly Growth | +0 | +0 | +0 | +0 |
| Language | N/A | Python | N/A | Python |
| Sources | 1 | 1 | 1 | 1 |
| License | N/A | Apache-2.0 | N/A | MIT |

Capability Radar vs sage

[Radar chart comparing Awesome-Knowledge-Distillation-of-LLMs against sage]
Maintenance Activity: 0

Last code push 396 days ago.
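The recency figure above can be reproduced from the public GitHub REST API, whose repository object exposes a `pushed_at` timestamp. A hedged sketch (unauthenticated request, error handling omitted):

```python
# Sketch: compute "last code push N days ago" from GitHub's pushed_at field.
# Uses the GET /repos/{owner}/{repo} endpoint; auth and rate-limit handling
# are omitted for brevity.
import json
import urllib.request
from datetime import datetime, timezone

def days_since_last_push(owner: str, repo: str) -> int:
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        pushed_at = json.load(resp)["pushed_at"]  # e.g. "2024-01-15T09:30:00Z"
    pushed = datetime.fromisoformat(pushed_at.replace("Z", "+00:00"))
    return (datetime.now(timezone.utc) - pushed).days

print(days_since_last_push("Tebmer", "Awesome-Knowledge-Distillation-of-LLMs"))
```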

Community Engagement: 29

Fork-to-star ratio: 5.7%. A lower fork ratio may indicate passive usage (many users star the repository without building on it).
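The ratio is plain arithmetic over the two counts. Since the page rounds stars to 1.3k, the exact denominator below (1,280) is an assumed value chosen to be consistent with the displayed 5.7%:

```python
# Worked arithmetic for the fork-to-star ratio; 1280 stars is an assumption
# consistent with the rounded "1.3k" and the displayed 5.7%.
stars, forks = 1280, 73
print(f"{forks / stars:.1%}")  # -> 5.7%
```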

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 30

No measurable growth in the current period; this is expected while velocity tracking is still in its cold-start phase.

License Clarity: 30

No clear license detected; proceed with caution.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
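The exact scoring rubric is not published on this page. As a hedged illustration of one way such "higher is healthier" scores can be produced, the sketch below linearly clamps a raw metric into [0, 100]; the function name and the worst/best ranges are invented for this example:

```python
# Assumed scoring sketch, not the dashboard's real formula: linearly map a
# raw metric from [worst, best] onto [0, 100], clamping out-of-range values.
def health_score(value: float, worst: float, best: float) -> int:
    frac = (value - worst) / (best - worst)
    return round(min(max(frac, 0.0), 1.0) * 100)

# Example: 396 days since the last push, with 0 days treated as best and
# anything beyond a year as worst, yields 0 (matching the score above).
print(health_score(396, worst=365, best=0))  # -> 0
```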