microsoft/Tutel
Tutel MoE: an optimized Mixture-of-Experts library supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4
Stars: 980 · Forks: 107 · Weekly growth: +0
Source: GitHub
Topics: deepseek, llm, mixture-of-experts, moe, pytorch
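The repository's headline feature is an optimized, sparsely gated Mixture-of-Experts layer. As a point of reference, here is a minimal plain-PyTorch sketch of the top-2 expert routing idea such a library accelerates. It is illustrative only and does not use Tutel's API; all class names, dimensions, and hyperparameters are assumptions.

```python
# Illustrative top-2 mixture-of-experts routing in plain PyTorch.
# NOT Tutel's API or implementation; names, shapes, and hyperparameters
# are assumptions chosen for a small, runnable demo.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyTop2MoE(nn.Module):
    def __init__(self, model_dim=512, hidden_dim=1024, num_experts=8):
        super().__init__()
        self.gate = nn.Linear(model_dim, num_experts)   # token -> expert scores
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(model_dim, hidden_dim), nn.ReLU(),
                          nn.Linear(hidden_dim, model_dim))
            for _ in range(num_experts)
        ])

    def forward(self, x):                               # x: [tokens, model_dim]
        probs = F.softmax(self.gate(x), dim=-1)         # routing probabilities
        topv, topi = probs.topk(2, dim=-1)              # top-2 experts per token
        topv = topv / topv.sum(dim=-1, keepdim=True)    # renormalize the two gate weights
        out = torch.zeros_like(x)
        for slot in range(2):                           # dense loop; real MoE kernels batch this
            for e, expert in enumerate(self.experts):
                mask = topi[:, slot] == e               # tokens whose slot routes to expert e
                if mask.any():
                    out[mask] += topv[mask, slot, None] * expert(x[mask])
        return out

x = torch.randn(16, 512)
print(TinyTop2MoE()(x).shape)  # torch.Size([16, 512])
```

Per the repository description, the point of a library like Tutel is to replace the dense Python loop above with optimized dispatch and kernels, including low-precision (FP8/NVFP4/MXFP4) expert compute.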
[Chart: Star & Fork Trend (17 data points), series: Stars, Forks]
Multi-Source Signals
Growth Velocity
microsoft/Tutel gained +0 stars in the current period. 7-day velocity: -0.1%.
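The dashboard does not state how the 7-day velocity is computed; a plausible reading is the percentage change in star count over the trailing week. The snippet below is a sketch under that assumption; the function name and the week-ago count are hypothetical.

```python
# Sketch: 7-day velocity as percent change in stars over the trailing week.
# The dashboard's actual formula is not documented; this is an assumption.
def seven_day_velocity(stars_now: int, stars_week_ago: int) -> float:
    if stars_week_ago == 0:
        return 0.0
    return (stars_now - stars_week_ago) / stars_week_ago * 100.0

# Hypothetical example: 980 stars now vs. 981 a week ago -> roughly -0.1%
print(round(seven_day_velocity(980, 981), 1))  # -0.1
```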
Similar Repositories
| Metric | Tutel | cali | torch-rechub | LLM-Blender |
|---|---|---|---|---|
| Stars | 980 | 980 | 981 | 978 |
| Forks | 107 | 54 | 135 | 89 |
| Weekly Growth | +0 | +0 | +4 | +0 |
| Language | C | TypeScript | Jupyter Notebook | Python |
| Sources | 1 | 1 | 1 | 1 |
| License | MIT | MIT | MIT | Apache-2.0 |
Capability Radar vs cali
[Radar chart: capability scores for Tutel and cali]
Maintenance Activity: 97
Last code push 12 days ago.
Community Engagement: 55
Fork-to-star ratio: 10.9% (107 forks / 980 stars; see the sketch below). Active community forking and contributing.
Issue Burden: 70
Issue data not yet available.
Growth Momentum: 30
No measurable growth in the current period (first-day cold start expected).
License Clarity: 95
Licensed under MIT. Permissive and safe for commercial use.
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
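For reference, the fork-to-star ratio quoted under Community Engagement follows directly from the header counts; the one-decimal rounding is assumed.

```python
# Sketch: fork-to-star ratio derived from the raw counts shown in the header.
stars, forks = 980, 107
print(f"{forks / stars * 100:.1f}%")  # 10.9%
```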