
yoshitomo-matsubara/torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at venues such as TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, and AAAI are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

Stars: 1.6k · Forks: 144 · Growth: +0/wk
Source: GitHub
amazon-sagemaker-lab cifar10 cifar100 coco colab-notebook glue google-colab image-classification imagenet knowledge-distillation natural-language-processing nlp
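The description above centers on knowledge distillation, the technique the framework packages behind its configuration files. As a rough orientation, the block below is a minimal, framework-free sketch of Hinton-style soft-target distillation in plain PyTorch. It is not torchdistill's API: the function name and the temperature and alpha defaults are assumptions chosen for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Hinton-style KD loss: soft-target KL term plus hard-label CE term.

    `temperature` and `alpha` are illustrative defaults, not values taken
    from torchdistill's configs.
    """
    # Soften both distributions, then match them with KL divergence.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean")
    kd_term = kd_term * (temperature ** 2)  # gradient rescaling from Hinton et al.

    # Standard cross-entropy against the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In practice, torchdistill wires losses like this together from YAML configuration rather than hand-written training code; the sketch only shows the underlying loss shape.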

[Chart: Star & Fork Trend, stars and forks over 22 data points]

Multi-Source Signals

Growth Velocity

yoshitomo-matsubara/torchdistill has gained +0 stars this period. Velocity data will become available once more historical data has been collected.
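As a hedged sketch of how a velocity figure like this could be derived, assume snapshots of (timestamp, star count) pairs; the function and sample data below are hypothetical and are not this dashboard's actual pipeline.

```python
from datetime import datetime

def weekly_star_velocity(snapshots):
    """Estimate stars gained per week from (datetime, star_count) pairs.

    With fewer than two snapshots no velocity can be computed, which is
    why a first-day cold start reports no data.
    """
    if len(snapshots) < 2:
        return None  # cold start: not enough history yet
    (t0, s0), (t1, s1) = snapshots[0], snapshots[-1]
    weeks = (t1 - t0).total_seconds() / (7 * 24 * 3600)
    return (s1 - s0) / weeks if weeks > 0 else None

# Hypothetical snapshots: 20 stars gained over 3 weeks.
snaps = [(datetime(2026, 1, 1), 1580), (datetime(2026, 1, 22), 1600)]
print(weekly_star_velocity(snaps))  # -> about 6.7 stars/week
```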

Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.

Metric          torchdistill   delta        fast-autoaugment   Face-Mask-Detection
Stars           1.6k           1.6k         1.6k               1.6k
Forks           144            284          197                863
Weekly Growth   +0             +0           +0                 +0
Language        Python         Python       Python             Jupyter Notebook
Sources         1              1            1                  1
License         MIT            Apache-2.0   MIT                MIT

Capability Radar vs delta

[Radar chart comparing torchdistill and delta across the five dimensions below]
Maintenance Activity 99

Last code push 9 days ago.

Community Engagement 45

Fork-to-star ratio: 9.0%. A lower fork ratio may indicate passive usage, i.e. many users star the project without forking it to contribute.
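The quoted ratio is simple arithmetic over the headline counts on this card; a quick check in Python, using the rounded figures shown above:

```python
stars, forks = 1600, 144       # headline counts from the card ("1.6k" stars, 144 forks)
fork_to_star = forks / stars   # 0.09
print(f"{fork_to_star:.1%}")   # -> 9.0%
```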

Issue Burden 70

Issue data not yet available.

Growth Momentum 30

No measurable growth in the current period (a first-day cold start is expected).

License Clarity 95

Licensed under MIT, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
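The page does not publish its scoring formula. As a purely hypothetical sketch, a composite health score over these five dimensions could be a weighted average of the per-dimension scores; every weight below is invented for illustration.

```python
# Hypothetical weights -- the dashboard's real formula is not published.
WEIGHTS = {
    "maintenance_activity": 0.30,
    "community_engagement": 0.20,
    "issue_burden": 0.15,
    "growth_momentum": 0.15,
    "license_clarity": 0.20,
}

def composite_health(scores: dict) -> float:
    """Weighted average of 0-100 dimension scores; higher is healthier."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Scores shown on this page for torchdistill:
print(composite_health({
    "maintenance_activity": 99,
    "community_engagement": 45,
    "issue_burden": 70,
    "growth_momentum": 30,
    "license_clarity": 95,
}))  # -> 72.7 under these assumed weights
```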