lucidrains/recurrent-memory-transformer-pytorch
Implementation of the Recurrent Memory Transformer (NeurIPS 2022 paper) in PyTorch
422 stars · 19 forks · +0 stars/wk (source: GitHub)
Topics: artificial-intelligence, attention-mechanisms, deep-learning, long-context, memory, recurrence, transformers
[Chart: Star & Fork Trend (21 data points), plotting stars and forks over time]
Multi-Source Signals
Growth Velocity
lucidrains/recurrent-memory-transformer-pytorch gained +0 stars this period. Velocity data will become available once more historical data has been collected.
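For context, a weekly star velocity of this kind can be estimated by periodically snapshotting GitHub's public repository endpoint and diffing the counts over time. The sketch below only illustrates that idea under assumed names (the `star_snapshots.json` history file and the `fetch_counts` / `weekly_velocity` helpers are hypothetical); it is not the pipeline that produces the numbers on this page.

```python
# Illustrative only: a simple way to track weekly star velocity by
# snapshotting the public GitHub REST API. File name and helper names
# are hypothetical, not part of this dashboard's actual pipeline.
import json
import time
from pathlib import Path

import requests

REPO = "lucidrains/recurrent-memory-transformer-pytorch"
SNAPSHOT_FILE = Path("star_snapshots.json")  # hypothetical local history file


def fetch_counts(repo: str) -> dict:
    """Fetch current star/fork counts from the GitHub REST API."""
    resp = requests.get(f"https://api.github.com/repos/{repo}", timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return {
        "timestamp": time.time(),
        "stars": data["stargazers_count"],
        "forks": data["forks_count"],
    }


def weekly_velocity(repo: str) -> float | None:
    """Append a snapshot and estimate stars gained per week from history.

    Returns None until at least two snapshots exist (the "cold start"
    situation described above).
    """
    history = json.loads(SNAPSHOT_FILE.read_text()) if SNAPSHOT_FILE.exists() else []
    history.append(fetch_counts(repo))
    SNAPSHOT_FILE.write_text(json.dumps(history, indent=2))

    if len(history) < 2:
        return None  # not enough historical data yet

    first, last = history[0], history[-1]
    elapsed_weeks = (last["timestamp"] - first["timestamp"]) / (7 * 24 * 3600)
    if elapsed_weeks == 0:
        return None
    return (last["stars"] - first["stars"]) / elapsed_weeks


if __name__ == "__main__":
    velocity = weekly_velocity(REPO)
    print(f"{REPO}: {velocity if velocity is not None else 'velocity pending'} stars/week")
```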
Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.
| Metric | recurrent-memory-transformer-pytorch | RL-Chatbot | ResourceBank_CV_NLP_MLOPS_2022 | online-continual-learning |
|---|---|---|---|---|
| Stars | 422 | 422 | 422 | 421 |
| Forks | 19 | 140 | 92 | 61 |
| Weekly Growth (stars/wk) | +0 | +0 | +0 | +0 |
| Language | Python | Python | Jupyter Notebook | Python |
| Sources | 1 | 1 | 1 | 1 |
| License | MIT | MIT | N/A | N/A |
[Chart: Capability Radar vs RL-Chatbot, comparing recurrent-memory-transformer-pytorch and RL-Chatbot across the metrics below]
- Maintenance Activity: 0 (last code push was 457 days ago)
- Community Engagement: 69 (fork-to-star ratio 4.5%, i.e. 19 forks / 422 stars; a lower fork ratio may indicate passive usage)
- Issue Burden: 70 (issue data not yet available)
- Growth Momentum: 30 (no measurable growth in the current period; a first-day cold start is expected)
- License Clarity: 95 (licensed under MIT, a permissive license that is safe for commercial use)
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
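As a rough illustration of how 0-100 scores of this kind could be derived from raw repository figures, the sketch below maps days since last push, fork-to-star ratio, and license type onto scores. The thresholds, caps, and helper names are assumptions made for the example; they are not the scoring formula actually used here and will not reproduce the exact scores shown above.

```python
# Hypothetical scoring sketch: maps raw repository figures to 0-100 scores.
# The thresholds below are assumptions for illustration only and do not
# reproduce the scores shown on this page.

def maintenance_score(days_since_push: int) -> int:
    """Fresher pushes score higher; anything older than a year scores 0."""
    return max(0, round(100 * (1 - days_since_push / 365)))

def engagement_score(forks: int, stars: int) -> int:
    """Scale the fork-to-star ratio, with an assumed 20% ratio mapping to 100."""
    if stars == 0:
        return 0
    ratio = forks / stars  # 19 / 422 is roughly 0.045, i.e. 4.5%
    return min(100, round(100 * ratio / 0.20))

def license_score(spdx_id: str | None) -> int:
    """Permissive licenses score highest; a missing license scores lowest."""
    permissive = {"MIT", "Apache-2.0", "BSD-3-Clause"}
    if spdx_id is None:
        return 0
    return 95 if spdx_id in permissive else 60

if __name__ == "__main__":
    print(maintenance_score(457))     # 0, since the last push is older than a year
    print(engagement_score(19, 422))  # about 23 under these assumed thresholds
    print(license_score("MIT"))       # 95
```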