
Tencent/TurboTransformers

A fast and user-friendly runtime for transformer inference (Bert, Albert, GPT2, Decoders, etc.) on CPU and GPU.

Stars: 1.5k | Forks: 207 | Growth: +0/wk | Source: GitHub
Topics: albert bert decoder gpt2 gpu huggingface-transformers inference machine-translation nlp pytorch roberta transformer

[Chart: Star & Fork Trend, 45 data points, stars and forks series]

Multi-Source Signals

Growth Velocity

Tencent/TurboTransformers has gained +0 stars this period. Velocity data will become available once more historical data has been collected.
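A velocity figure like the one above can be derived from two star-count snapshots. A minimal sketch, assuming (date, star_count) pairs collected by the dashboard; the snapshot values below are illustrative, not from the repository:

```python
from datetime import date

def weekly_star_velocity(snapshots):
    """Estimate stars gained per week from (date, star_count) snapshots.

    `snapshots` must be sorted by date. Returns 0.0 when fewer than
    two points exist or no time has elapsed (the cold-start case).
    """
    if len(snapshots) < 2:
        return 0.0
    (d0, s0), (d1, s1) = snapshots[0], snapshots[-1]
    days = (d1 - d0).days
    if days == 0:
        return 0.0
    return (s1 - s0) / days * 7

# Illustrative history: star count unchanged over one week -> +0/wk
history = [(date(2024, 1, 1), 1500), (date(2024, 1, 8), 1500)]
print(weekly_star_velocity(history))  # 0.0
```

With a single snapshot the function deliberately returns 0.0 rather than guessing, which matches the dashboard's behavior on day one.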


Metric          TurboTransformers   OpenAdapt   PageLM        bi-att-flow
Stars           1.5k                1.5k        1.5k          1.5k
Forks           207                 228         211           672
Weekly Growth   +0                  -2          -1            +0
Language        C++                 Python      TypeScript    Python
Sources         1                   1           1             1
License         NOASSERTION         MIT         NOASSERTION   Apache-2.0

Capability Radar vs OpenAdapt

Maintenance Activity: 0. Last code push was 265 days ago.

Community Engagement: 67. Fork-to-star ratio is 13.4%; the community actively forks and contributes.

Issue Burden: 70. Issue data is not yet available.

Growth Momentum: 30. No measurable growth in the current period (a first-day cold start is expected).

License Clarity: 30. No clear license detected; proceed with caution.
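The fork-to-star ratio above is plain arithmetic over two public counters. A minimal sketch; the exact star count is an assumption, since the page shows only the rounded 1.5k:

```python
def fork_to_star_ratio(forks, stars):
    """Percentage of stargazers who have also forked the repository."""
    if stars == 0:
        return 0.0
    return round(forks / stars * 100, 1)

# Assumed exact counts: 207 forks, ~1544 stars (displayed as 1.5k)
print(fork_to_star_ratio(207, 1544))  # 13.4
```

Guarding against a zero star count keeps the metric defined for brand-new repositories.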

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
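One plausible, entirely hypothetical mapping from a raw signal to a 0-100 score, consistent with the Maintenance Activity value shown above (the linear 100-day decay is an assumption, not the dashboard's published formula):

```python
def maintenance_score(days_since_last_push):
    """Hypothetical 0-100 health score: full marks for a push today,
    decaying linearly to 0 after 100 days of inactivity."""
    return max(0, 100 - days_since_last_push)

# 265 days since the last push -> score 0, matching the radar above
print(maintenance_score(265))  # 0
```

Any monotone decay (exponential, stepped) would serve equally well; the key property is that higher scores track healthier, more recent activity.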