fla-org/flash-linear-attention
🚀 Efficient implementations for emerging model architectures
4.8k stars · 484 forks · +7 stars/week
Topics: large-language-models · machine-learning-systems · natural-language-processing · sequence-modeling
[Chart: Star & Fork Trend, 18 data points (Stars vs. Forks)]
Multi-Source Signals
Growth Velocity
fla-org/flash-linear-attention gained +7 stars this period. 7-day velocity: 0.3%.
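The velocity figure can be reproduced with simple arithmetic. A minimal sketch, assuming velocity is defined as weekly star gain divided by total stars; the exact window and star base the dashboard uses are assumptions (note the page itself shows both 0.3% and 0.14%, presumably over different windows):

```python
def star_velocity(weekly_gain: int, total_stars: int) -> float:
    """Weekly star gain as a percentage of the current star count."""
    return 100.0 * weekly_gain / total_stars

# Assumed figures from the dashboard: +7 stars on a ~4.8k star base.
print(round(star_velocity(7, 4800), 2))  # ~0.15 under this definition
```

Under this definition the result lands near the 0.14% growth rate shown in the risk breakdown rather than the 0.3% headline number.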
| Metric | flash-linear-attention | EasyR1 | Yuxi | prompt-master |
|---|---|---|---|---|
| Stars | 4.8k | 4.8k | 4.8k | 4.8k |
| Forks | 484 | 367 | 659 | 469 |
| Weekly Growth | +7 | +4 | +7 | +30 |
| Language | Python | Python | Python | N/A |
| Sources | 1 | 1 | 1 | 1 |
| License | MIT | Apache-2.0 | MIT | MIT |
[Chart: Capability Radar, flash-linear-attention vs. EasyR1]
Maintenance Activity 100
Last code push: 1 day ago.
Community Engagement 50
Fork-to-star ratio: 10.0%, indicating an active community that forks and contributes.
Issue Burden 70
Issue data not yet available.
Growth Momentum 49
+7 stars this period (0.14% growth rate).
License Clarity 95
Licensed under MIT. Permissive — safe for commercial use.
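The community-engagement score above is driven by the fork-to-star ratio. A minimal sketch, assuming the ratio is simply forks divided by stars; the exact star count of 4,840 is an assumption backing out the rounded 4.8k figure:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars; a rough proxy for contributor interest."""
    return 100.0 * forks / stars

# 484 forks on roughly 4.8k stars (exact star count is an assumption).
print(round(fork_to_star_ratio(484, 4840), 1))  # 10.0, matching the dashboard
```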
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
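How the five per-metric scores might roll up into a single health number is not specified on the page. A minimal sketch, assuming a plain weighted average over the scores shown above; the weights themselves are assumptions, not the dashboard's actual formula:

```python
# Per-metric scores taken from the dashboard; weights are hypothetical.
SCORES = {
    "maintenance_activity": 100,
    "community_engagement": 50,
    "issue_burden": 70,
    "growth_momentum": 49,
    "license_clarity": 95,
}
WEIGHTS = {
    "maintenance_activity": 0.3,
    "community_engagement": 0.2,
    "issue_burden": 0.2,
    "growth_momentum": 0.2,
    "license_clarity": 0.1,
}

def overall_health(scores: dict, weights: dict) -> float:
    """Weighted average of per-metric scores on a 0-100 scale."""
    return sum(scores[k] * weights[k] for k in scores)

print(round(overall_health(SCORES, WEIGHTS), 1))  # → 73.3 with these weights
```

A real scoring pipeline would likely normalize each raw signal (push recency, issue counts, star deltas) before weighting, but the weighted-average shape is a common choice for this kind of composite.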