
dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

3.8k stars · 511 forks · +2 stars/week
Source: GitHub
Topics: co-training, deep-learning, distillation, distillation-model, kd, knowldge-distillation, knowledge-distillation, knowledge-transfer, model-compression, model-distillation, teacher-student

[Chart: Star & Fork Trend (17 data points), series: Stars, Forks]

Multi-Source Signals

Growth Velocity

dkozlov/awesome-knowledge-distillation gained +2 stars this period. 7-day velocity: 0.2%.


Metric         awesome-knowledge-distillation   mmpretrain   hands-on-ml-zh   pytorch-fid
Stars          3.8k                             3.8k         3.8k             3.8k
Forks          511                              1.1k         1.5k             527
Weekly Growth  +2                               +0           +0               +0
Language       N/A                              Python       CSS              Python
Sources        1                                1            1                1
License        Apache-2.0                       Apache-2.0   N/A              Apache-2.0

Capability Radar vs mmpretrain

[Radar chart comparing awesome-knowledge-distillation and mmpretrain]
Maintenance Activity: 94

Last code push 18 days ago.

Community Engagement: 67

Fork-to-star ratio: 13.3%, indicating an active community that forks and contributes.
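The fork-to-star ratio is simple arithmetic over the counts shown on this page. A minimal sketch, assuming an exact star total of 3843 (the page only shows "3.8k", so the precise figure is an assumption for illustration):

```python
# Fork-to-star ratio: what fraction of stargazers also forked the repo.
# Counts taken from the dashboard; the exact star total (3843) is assumed,
# since the page rounds it to "3.8k".
forks = 511
stars = 3843

ratio_pct = forks / stars * 100
print(f"Fork-to-star ratio: {ratio_pct:.1f}%")  # ~13.3%
```

A higher ratio generally suggests that people who find the repository are also modifying or contributing to it, not just bookmarking it.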

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 43

+2 stars this period (0.05% growth rate).
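The growth rate appears to be the period's new stars divided by the total star count. A minimal sketch, again assuming an exact star total of 3843 behind the rounded "3.8k":

```python
# Period growth rate: new stars relative to the total star count.
# The exact star total (3843) is an assumption; the page shows "3.8k".
new_stars = 2
total_stars = 3843

growth_pct = new_stars / total_stars * 100
print(f"Growth rate: {growth_pct:.2f}%")  # ~0.05%
```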

License Clarity: 95

Licensed under Apache-2.0. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.