
microsoft/LLMLingua

[EMNLP'23, ACL'24] To speed up LLM inference and sharpen a model's perception of key information, LLMLingua compresses the prompt and the KV-Cache, achieving up to 20x compression with minimal performance loss.
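LLMLingua's actual method uses a small language model to score token informativeness before pruning. As a loose, stdlib-only illustration of the general idea (not the library's algorithm), the toy sketch below drops common function words and keeps content-bearing tokens:

```python
# Toy illustration of prompt compression (NOT LLMLingua's algorithm).
# LLMLingua scores tokens with a small LM and drops low-information
# ones; this sketch crudely mimics the idea by removing stopwords.

STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "are", "that",
             "which", "with", "for", "in", "on", "it", "this"}

def toy_compress(prompt: str) -> str:
    """Drop stopword tokens; real compressors preserve task-critical ones."""
    kept = [tok for tok in prompt.split() if tok.lower() not in STOPWORDS]
    return " ".join(kept)

prompt = "Summarize the key findings of the report that is attached to this email"
compressed = toy_compress(prompt)
# -> "Summarize key findings report attached email"
ratio = len(prompt.split()) / len(compressed.split())
print(compressed)
print(f"{ratio:.1f}x fewer tokens")
```

A real compressor must keep tokens the downstream task depends on (numbers, entities, instructions), which is why LLMLingua uses model-based scoring rather than a fixed word list.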

6.0k stars · 357 forks · +3/wk · GitHub Trend #3

[Chart: star & fork trend, 19 data points]

Multi-Source Signals

Growth Velocity

microsoft/LLMLingua gained +3 stars this period. 7-day velocity: 0.2%.

Deep, signal-backed technical analysis is being generated for this repository and will be available soon.

No comparable projects found in the same topic categories.

Maintenance Activity 100

Last code push 2 days ago.

Community Engagement 30

Fork-to-star ratio: 6.0%. A lower fork ratio may indicate passive usage.
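The ratio above can be recomputed from the card's own headline numbers (6.0k stars, 357 forks); the exact quotient is ≈5.95%, which the card displays rounded to 6.0%:

```python
# Recompute the engagement ratio from the headline numbers on this card.
stars, forks = 6000, 357
fork_to_star = forks / stars * 100  # ≈ 5.95, shown on the card as 6.0%
print(f"fork-to-star ratio: {fork_to_star:.2f}%")
```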

Issue Burden 70

Issue data not yet available.

Growth Momentum 43

+3 stars this period — 0.05% growth rate.
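The momentum figure follows directly from the star counts: 3 new stars on a base of roughly 6,000 is 0.05%:

```python
# Growth rate behind the momentum score: new stars over the star base.
base_stars, new_stars = 6000, 3
growth = new_stars / base_stars * 100
print(f"growth rate: {growth:.2f}%")  # 0.05%
```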

License Clarity 95

Licensed under MIT. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.