ZhengZixiang/ATPapers
Worth-reading papers and related resources on attention mechanisms, Transformers, and pretrained language models (PLMs) such as BERT.
Star & Fork Trend (33 data points)
Multi-Source Signals
Growth Velocity
ZhengZixiang/ATPapers has gained +0 stars this period. Velocity data will be available after more historical data is collected.
| Metric | ATPapers | FinBERT-QA | weightgain | awesome-bert-japanese |
|---|---|---|---|---|
| Stars | 130 | 130 | 131 | 132 |
| Forks | 13 | 30 | 7 | 7 |
| Weekly Star Growth | +0 | +0 | +0 | +0 |
| Language | N/A | Python | Python | N/A |
| Sources | 1 | 1 | 1 | 1 |
| License | N/A | N/A | MIT | N/A |
Capability Radar vs FinBERT-QA
Last code push was 1838 days ago.
Fork-to-star ratio: 10.0%. A lower fork-to-star ratio may indicate passive usage (starring without forking or contributing).
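Both figures above can be recomputed directly from the GitHub REST API repository endpoint, which exposes `stargazers_count`, `forks_count`, and `pushed_at`. A minimal sketch (live values will have drifted from the snapshot in this report):

```python
import json
import urllib.request
from datetime import datetime, timezone

# Fetch live repository metadata from the GitHub REST API.
url = "https://api.github.com/repos/ZhengZixiang/ATPapers"
with urllib.request.urlopen(url) as resp:
    repo = json.load(resp)

stars = repo["stargazers_count"]  # 130 at the time of this report
forks = repo["forks_count"]       # 13 at the time of this report

# Fork-to-star ratio as reported above: 13 / 130 = 10.0%
ratio = forks / stars * 100
print(f"Fork-to-star ratio: {ratio:.1f}%")

# Days since the last code push; "pushed_at" is an ISO 8601 UTC timestamp.
pushed_at = datetime.strptime(repo["pushed_at"], "%Y-%m-%dT%H:%M:%SZ")
pushed_at = pushed_at.replace(tzinfo=timezone.utc)
days_idle = (datetime.now(timezone.utc) - pushed_at).days
print(f"Last code push was {days_idle} days ago")
```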
Issue data not yet available.
No measurable growth in the current period (a first-day cold start is expected).
No clear license detected; proceed with caution.
Risk scores are computed from real-time repository data; on this scale, higher scores indicate healthier metrics (i.e. lower risk).
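The report does not disclose the scoring formula. Purely as a hypothetical illustration of how signals like push recency, fork engagement, popularity, and license presence might be folded into a single 0-100 health score, here is a sketch; the weights and saturation points below are invented, not the dashboard's actual method:

```python
# Purely hypothetical weighting -- the report does not disclose its formula.
def health_score(stars: int, forks: int, days_since_push: int, has_license: bool) -> float:
    recency = max(0.0, 1.0 - days_since_push / 365.0)   # decays to 0 after a year idle
    engagement = min(forks / stars, 1.0) if stars else 0.0  # fork-to-star ratio, capped
    popularity = min(stars / 1000.0, 1.0)               # saturates at 1k stars
    license_bonus = 0.1 if has_license else 0.0
    return round(100 * (0.4 * recency + 0.3 * engagement
                        + 0.2 * popularity + license_bonus), 1)

# With this report's numbers: 130 stars, 13 forks, 1838 days idle, no license.
print(health_score(130, 13, 1838, False))  # low score, consistent with the flags above
```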