yoshitomo-matsubara/torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies; part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at venues such as TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, and AAAI have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and enable benchmarking.
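For context on what these distillation methods optimize, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) in plain PyTorch. It is illustrative only and is not torchdistill's API; in torchdistill, a loss like this would typically be declared in a YAML configuration rather than hand-coded. The `temperature` and `alpha` values are assumed defaults, not values taken from the framework.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.9):
    """Classic soft-target KD loss (Hinton et al., 2015).

    Blends KL divergence between temperature-softened teacher and
    student distributions with ordinary cross-entropy on the labels.
    `temperature` and `alpha` are illustrative defaults, not values
    taken from torchdistill.
    """
    # Soften both distributions; the T^2 factor rescales gradients
    # so the soft term stays comparable to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Usage sketch: teacher outputs are fixed; the student trains on the blend.
teacher_logits = torch.randn(8, 10)  # stand-in for frozen teacher outputs
student_logits = torch.randn(8, 10, requires_grad=True)
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
loss.backward()
```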
[Chart: Star & Fork Trend (22 data points)]
Growth Velocity
yoshitomo-matsubara/torchdistill has gained +0 stars this period. Velocity data will become available after more historical data is collected.
| Metric | torchdistill | delta | fast-autoaugment | Face-Mask-Detection |
|---|---|---|---|---|
| Stars | 1.6k | 1.6k | 1.6k | 1.6k |
| Forks | 144 | 284 | 197 | 863 |
| Weekly Growth (stars) | +0 | +0 | +0 | +0 |
| Language | Python | Python | Python | Jupyter Notebook |
| Sources | 1 | 1 | 1 | 1 |
| License | MIT | Apache-2.0 | MIT | MIT |
[Chart: Capability Radar vs delta]
Last code push 9 days ago.
Fork-to-star ratio: 9.0% (144 forks / ~1,600 stars). A lower fork ratio may indicate passive usage rather than active modification.
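As a quick check, the ratio is simply forks divided by stars. A minimal sketch using the rounded counts from the comparison table above (the 1.6k star figures are assumed to be roughly 1,600):

```python
# Fork-to-star ratio = forks / stars, using rounded counts
# from the comparison table above (1.6k taken as 1,600).
repos = {
    "torchdistill": {"stars": 1600, "forks": 144},
    "delta": {"stars": 1600, "forks": 284},
    "fast-autoaugment": {"stars": 1600, "forks": 197},
    "Face-Mask-Detection": {"stars": 1600, "forks": 863},
}
for name, counts in repos.items():
    ratio = counts["forks"] / counts["stars"]
    print(f"{name}: {ratio:.1%}")  # torchdistill -> 9.0%
```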
Issue data not yet available.
No measurable growth in the current period (expected for a first-day cold start).
Licensed under MIT, a permissive license that is safe for commercial use.
Repository health scores are computed from real-time repository data; higher scores indicate healthier metrics.