
JetRunner/BERT-of-Theseus

⛵️ The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
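
To make the compression idea concrete, the sketch below is a minimal re-implementation of progressive module replacing in PyTorch. It is illustrative only, not the repository's code: the class name `TheseusEncoder`, the `nn.Linear` stand-in layers, and the 0.5 replacement probability are assumptions for the example.

```python
import torch
import torch.nn as nn


class TheseusEncoder(nn.Module):
    """Each compact successor module stands in for a group of original
    predecessor layers and is swapped in with probability `replace_prob`
    during training; at inference time only the successors run."""

    def __init__(self, predecessor_layers, successor_layers, replace_prob=0.5):
        super().__init__()
        assert len(predecessor_layers) % len(successor_layers) == 0
        self.predecessors = nn.ModuleList(predecessor_layers)
        self.successors = nn.ModuleList(successor_layers)
        self.group = len(predecessor_layers) // len(successor_layers)
        self.replace_prob = replace_prob
        # The predecessor model stays frozen; only successors are trained.
        for param in self.predecessors.parameters():
            param.requires_grad = False

    def forward(self, hidden):
        for i, successor in enumerate(self.successors):
            use_successor = (not self.training) or (
                torch.rand(()).item() < self.replace_prob
            )
            if use_successor:
                hidden = successor(hidden)
            else:
                # Run the group of original layers this successor replaces.
                for j in range(i * self.group, (i + 1) * self.group):
                    hidden = self.predecessors[j](hidden)
        return hidden


# Example: compress a 12-layer stack into 6 successor layers (2x smaller).
encoder = TheseusEncoder(
    [nn.Linear(16, 16) for _ in range(12)],
    [nn.Linear(16, 16) for _ in range(6)],
)
out = encoder(torch.randn(4, 16))
```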

315 stars · 39 forks · +0 stars/wk
Source: GitHub
Topics: bert, glue, model-compression, nlp, transformers

Star & fork trend chart (18 data points; stars and forks over time).

Multi-Source Signals

Growth Velocity

JetRunner/BERT-of-Theseus has gained +0 stars this period. Velocity data will be available after more historical data is collected.
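
As an illustration of how a figure like "+0 stars/wk" could be derived, the sketch below computes weekly star growth from timestamped snapshots. The snapshot values are hypothetical, and the tracker's actual method is not documented here.

```python
from datetime import datetime

# Hypothetical star-count snapshots (timestamp, stars); a real tracker
# would persist these as it polls the repository over time.
snapshots = [
    (datetime(2024, 5, 1), 315),
    (datetime(2024, 5, 8), 315),
]

(t0, s0), (t1, s1) = snapshots[0], snapshots[-1]
weeks = max((t1 - t0).days / 7, 1e-9)  # guard against a zero-length window
velocity = (s1 - s0) / weeks
print(f"{velocity:+.0f} stars/wk")  # -> +0 stars/wk
```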

Deep analysis is being generated for this repository.

Signal-backed technical analysis will be available soon.

Metric         BERT-of-Theseus  LLMPapers  bert_distill  naturalcc
Stars          315              315        316           318
Forks          39               25         82            59
Weekly Growth  +0               +0         +0            +0
Language       Python           TeX        Python        Python
Sources        1                1          1             1
License        Apache-2.0       N/A        N/A           MIT
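
One plausible way a page like this could gather the metrics above is the public GitHub REST API (`GET /repos/{owner}/{repo}`). The field names below come from that real API; the loop itself is an illustrative sketch, and since the owners of the three comparison repositories are not shown on this page, their slugs are left as placeholders.

```python
import requests

# Only the full slug of the main repository is known from this page.
REPOS = [
    "JetRunner/BERT-of-Theseus",
    # "<owner>/LLMPapers",
    # "<owner>/bert_distill",
    # "<owner>/naturalcc",
]

for slug in REPOS:
    resp = requests.get(f"https://api.github.com/repos/{slug}", timeout=10)
    resp.raise_for_status()
    data = resp.json()
    license_id = (data.get("license") or {}).get("spdx_id", "N/A")
    print(slug, data["stargazers_count"], data["forks_count"],
          data["language"], license_id)
```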

Capability radar chart comparing BERT-of-Theseus with LLMPapers.
Maintenance Activity: 0

Last code push was 1031 days ago.

Community Engagement: 62

Fork-to-star ratio: 12.4% (39 forks / 315 stars). An active community is forking and contributing.
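
The 12.4% figure follows directly from the counts in the page header; a quick check:

```python
stars, forks = 315, 39
print(f"{forks / stars:.1%}")  # -> 12.4%
```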

Issue Burden: 70

Issue data is not yet available.

Growth Momentum: 30

No measurable growth in the current period; a first-day cold start is expected.

License Clarity: 95

Licensed under Apache-2.0, a permissive license that is safe for commercial use.

Scores are computed from real-time repository data; higher scores indicate healthier metrics.
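
The scoring formulas behind these numbers are not documented on this page. The sketch below shows one plausible way such 0-100 scores could be derived from raw repository fields; the 730-day decay window and the license weights are assumptions, not the dashboard's actual formula.

```python
from datetime import datetime, timedelta, timezone


def maintenance_score(last_push, now):
    """Decay linearly from 100 (pushed today) to 0 (no push in ~2 years).
    The 730-day window is an assumption, not the dashboard's formula."""
    days = (now - last_push).days
    return max(0, round(100 * (1 - days / 730)))


def license_score(spdx_id):
    """Assumed weights: permissive licenses score high, unknown ones low."""
    permissive = {"Apache-2.0", "MIT", "BSD-3-Clause"}
    if spdx_id in permissive:
        return 95
    return 40 if spdx_id else 10


now = datetime.now(timezone.utc)
# A push 1031 days ago falls past the 730-day window, giving a score of 0,
# consistent with the Maintenance Activity figure above.
print(maintenance_score(now - timedelta(days=1031), now))  # -> 0
print(license_score("Apache-2.0"))                         # -> 95
```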