
InternLM/xtuner

A Next-Generation Training Engine Built for Ultra-Large MoE Models

GitHub: 5.1k stars · 413 forks · +0/wk
agent deepseek-v3 gpt-oss intern-s1 internvl kimi-k2 llm multimodal qwen3-moe qwen3-vl reinforcement-learning
Trend 3

Star & Fork Trend (52 data points)


Multi-Source Signals

Growth Velocity

InternLM/xtuner has gained +0 stars this period. 7-day velocity: 0.1%.
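The 7-day velocity is presumably the stars gained over the window expressed as a percentage of the total star count; a minimal sketch, assuming that formula (the site does not document its exact method):

```python
def weekly_velocity(stars_gained_7d: int, total_stars: int) -> float:
    """Star growth over the last 7 days as a percentage of the total count.

    The formula is an assumption about how the dashboard derives velocity,
    not its documented definition.
    """
    if total_stars == 0:
        return 0.0
    return 100 * stars_gained_7d / total_stars

# Roughly 5 stars gained against ~5,100 total would match the reported 0.1%.
print(round(weekly_velocity(5, 5100), 1))  # 0.1
```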

Deep, signal-backed technical analysis for this repository is being generated and will be available soon.

Metric         xtuner      copilot     read-frog   Awesome-LLM-Inference
Stars          5.1k        5.1k        5.1k        5.1k
Forks          413         404         332         359
Weekly Growth  +0          +0          +10         +2
Language       Python      TypeScript  TypeScript  Python
Sources        1           1           1           1
License        Apache-2.0  MIT         GPL-3.0     GPL-3.0

Capability Radar vs copilot

Maintenance Activity: 100

Last code push: 1 day ago.

Community Engagement: 40

Fork-to-star ratio: 8.1%. A lower fork ratio may indicate passive usage (starring without forking).
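The reported 8.1% is consistent with 413 forks against roughly 5,100 stars (the 5.1k figure is rounded, so the exact input count is approximate):

```python
def fork_star_ratio(forks: int, stars: int) -> float:
    """Fork count as a percentage of star count.

    Low values suggest many users star a repository without forking it,
    i.e. passive consumption rather than active modification.
    """
    return 100 * forks / stars

# xtuner: 413 forks against ~5,100 stars
print(round(fork_star_ratio(413, 5100), 1))  # 8.1
```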

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 30

No measurable growth in the current period (a first-day cold start is expected).

License Clarity: 95

Licensed under Apache-2.0, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.