
xlite-dev/Awesome-LLM-Inference

📚A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc.🎉

5.1k stars · 359 forks · +3 stars/wk
Source: GitHub
Topics: awesome-llm, deepseek, deepseek-r1, deepseek-v3, flash-attention, flash-attention-3, flash-mla, llm-inference, minimax-01, mla, paged-attention, qwen3

[Chart: Star & Fork Trend (21 data points) — series: Stars, Forks]

Multi-Source Signals

Growth Velocity

xlite-dev/Awesome-LLM-Inference gained +3 stars this period. 7-day velocity: 0.1%.


| Metric        | Awesome-LLM-Inference | xtuner     | copilot    | sparrow |
|---------------|-----------------------|------------|------------|---------|
| Stars         | 5.1k                  | 5.1k       | 5.1k       | 5.1k    |
| Forks         | 359                   | 413        | 404        | 511     |
| Weekly Growth | +3                    | +0         | +0         | +0      |
| Language      | Python                | Python     | TypeScript | Python  |
| Sources       | 1                     | 1          | 1          | 1       |
| License       | GPL-3.0               | Apache-2.0 | MIT        | GPL-3.0 |

Capability Radar vs xtuner

[Radar chart: Awesome-LLM-Inference vs xtuner]

Maintenance Activity: 100

Last code push 4 days ago.

Community Engagement: 35

Fork-to-star ratio: 7.0%. A lower fork ratio may indicate passive usage (many readers, few contributors).
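As a sanity check, the fork-to-star ratio above is simple arithmetic over the headline counts. A minimal sketch, assuming the displayed "5.1k" stars expands to roughly 5,100:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Share of stargazers who also forked, as a percentage (1 decimal)."""
    return round(100 * forks / stars, 1)

# 359 forks and ~5,100 stars (assumed expansion of "5.1k") give the 7.0% shown.
print(fork_to_star_ratio(359, 5100))  # → 7.0
```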

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 44

+3 stars this period, a 0.06% growth rate.
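The growth rate follows the same pattern: new stars over the current total. A sketch under the same assumption that "5.1k" means about 5,100 stars:

```python
def growth_rate(new_stars: int, total_stars: int) -> float:
    """Period star growth as a percentage of the current count (2 decimals)."""
    return round(100 * new_stars / total_stars, 2)

# +3 stars against ~5,100 total (assumed) matches the 0.06% figure.
print(growth_rate(3, 5100))  # → 0.06
```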

License Clarity: 70

Licensed under GPL-3.0, a copyleft license; check compatibility requirements before reuse.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.