
guotong1988/BERT-pre-training

Multi-GPU pre-training for BERT on a single machine, without Horovod (data parallelism)

171 stars · 52 forks · +0 stars/week
Source: GitHub
Topics: bert, multi-gpu, nlp, tensorflow
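
The repository description names the core technique: data-parallel, multi-GPU pre-training on a single machine without Horovod. As a rough sketch of that idea, and not this repository's actual code, the snippet below uses TensorFlow's built-in tf.distribute.MirroredStrategy; the model layers, batch sizes, and data are all placeholders:

```python
import tensorflow as tf

# Single-machine data parallelism without Horovod: MirroredStrategy
# replicates the model onto every visible GPU, splits each global batch
# across the replicas, and all-reduces the gradients at every step.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

PER_GPU_BATCH = 32
GLOBAL_BATCH = PER_GPU_BATCH * strategy.num_replicas_in_sync

with strategy.scope():
    # Toy stand-in for the BERT model; layer sizes are placeholders.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(768, activation="gelu"),
        tf.keras.layers.Dense(2),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Random placeholder data, for illustration only.
features = tf.random.normal([1024, 128])
labels = tf.random.uniform([1024], maxval=2, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(GLOBAL_BATCH)

model.fit(dataset, epochs=1)
```

Because MirroredStrategy keeps one replica per GPU and all-reduces gradients locally (NCCL by default), no separate dependency like Horovod is needed for single-machine data parallelism.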

[Chart: Star & Fork Trend (18 data points), plotting stars and forks over time]

Multi-Source Signals

Growth Velocity

guotong1988/BERT-pre-training has gained +0 stars this period. Velocity data will be available once more historical data has been collected. A minimal sketch of the computation is shown below.
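
Growth velocity here is simply stars gained per unit time between snapshots. A minimal sketch, assuming hypothetical timestamped star counts (the collector's real data format is not published):

```python
from datetime import date

# Hypothetical snapshots of (date, cumulative stars); real history
# would come from the collector once enough data points exist.
history = [(date(2024, 1, 1), 150), (date(2024, 3, 1), 171)]

def weekly_velocity(snapshots):
    """Average stars gained per week between the first and last snapshot."""
    (t0, s0), (t1, s1) = snapshots[0], snapshots[-1]
    weeks = (t1 - t0).days / 7
    return (s1 - s0) / weeks if weeks else float("nan")

print(f"+{weekly_velocity(history):.1f}/wk")
```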

Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.

Metric | BERT-pre-training | pythorch-text-classification | dspy-go | EuroEval
Stars | 171 | 171 | 171 | 173
Forks | 52 | 10 | 14 | 52
Weekly Growth | +0 | +0 | +1 | +1
Language | Python | Python | Go | Python
Sources | 1 | 1 | 1 | 1
License | Apache-2.0 | Apache-2.0 | MIT | MIT

Capability Radar vs pythorch-text-classification

Maintenance Activity: 45

Last code push: 103 days ago.

Community Engagement: 100

Fork-to-star ratio: 52 / 171 ≈ 30.4%. An active community is forking and contributing.

Issue Burden: 70

Issue data is not yet available.

Growth Momentum: 30

No measurable growth in the current period (expected on a first-day cold start).

License Clarity: 95

Licensed under Apache-2.0. Permissive and safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
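
The scoring formulas themselves are not published. Purely as an illustration of how raw repository signals could map onto a 0-100 scale, here is a hypothetical sketch; every function name, decay horizon, and saturation threshold below is invented and does not reproduce the dashboard's actual numbers:

```python
def clamp(x: float) -> float:
    """Clamp a raw value into the dashboard's 0-100 score range."""
    return max(0.0, min(100.0, x))

# Illustrative guesses only; the real weighting is unknown.

def maintenance_score(days_since_push: int, horizon: int = 365) -> float:
    # Linear decay: a push today scores 100; one `horizon` days old scores 0.
    # A 103-day-old push gives ~72 here, versus the dashboard's 45, so the
    # actual rule evidently penalizes staleness more aggressively.
    return clamp(100.0 * (1.0 - days_since_push / horizon))

def engagement_score(forks: int, stars: int, saturation: float = 0.25) -> float:
    # Fork-to-star ratio, saturating at 25% for full marks.
    ratio = forks / max(stars, 1)
    return clamp(100.0 * ratio / saturation)

print(maintenance_score(103))     # ~71.8 under this toy decay
print(engagement_score(52, 171))  # 30.4% ratio -> capped at 100
```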