
galilai-group/stable-pretraining

Reliable, minimal and scalable library for pretraining foundation and world models

181 stars · 34 forks · +1/wk (GitHub)
computer-vision computer-vision-algorithms contrastive-learning deep-learning distributed foundation-models joint-embedding joint-embedding-predictive-architecture large-language-model multimodal-learning pytorch self-supervised-learning

[Chart: Star & Fork Trend, 20 data points; series: Stars and Forks]

Multi-Source Signals

Growth Velocity

galilai-group/stable-pretraining gained +1 star this period. 7-day velocity: 1.7%.


| Metric        | stable-pretraining | ngc-learn    | Computer-Vision-Projects | multihead-siamese-nets |
| ------------- | ------------------ | ------------ | ------------------------ | ---------------------- |
| Stars         | 181                | 181          | 180                      | 183                    |
| Forks         | 34                 | 34           | 37                       | 43                     |
| Weekly Growth | +1                 | +0           | +0                       | +0                     |
| Language      | Python             | Python       | Jupyter Notebook         | Jupyter Notebook       |
| Sources       | 1                  | 1            | 1                        | 1                      |
| License       | MIT                | BSD-3-Clause | N/A                      | MIT                    |

Capability Radar vs ngc-learn

[Radar chart comparing stable-pretraining and ngc-learn across the dimensions below]
Maintenance Activity: 100

Last code push: 1 day ago.

Community Engagement: 94

Fork-to-star ratio: 18.8% (34 forks / 181 stars). An active community is forking and contributing.
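The engagement figure is plain arithmetic over the headline counts; a minimal sketch (the helper name is illustrative, and the numbers are taken from this page):

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars."""
    return forks / stars * 100

# Figures for stable-pretraining from the comparison table above.
ratio = fork_to_star_ratio(forks=34, stars=181)
print(f"{ratio:.1f}%")  # 18.8%
```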

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 73

+1 star this period, a 0.55% growth rate.
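The quoted growth rate appears to be stars gained this period divided by the current star count; a hedged one-liner under that assumption (helper name is illustrative):

```python
def growth_rate(stars_gained: int, total_stars: int) -> float:
    """Period growth as a percentage of the current star count."""
    return stars_gained / total_stars * 100

print(f"{growth_rate(1, 181):.2f}%")  # 0.55%
```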

License Clarity: 95

Licensed under MIT. Permissive; safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
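The raw counts behind these scores can be pulled from GitHub's public REST API. A sketch under that assumption (the endpoint and field names come from GitHub's documented `/repos/{owner}/{repo}` payload, not from this page; unauthenticated requests are rate-limited):

```python
import json
import urllib.request


def parse_repo_stats(data: dict) -> dict:
    """Extract the fields this dashboard displays from a GitHub /repos payload."""
    return {
        "stars": data["stargazers_count"],
        "forks": data["forks_count"],
        # The license object is null for unlicensed repositories.
        "license": (data.get("license") or {}).get("spdx_id"),
    }


def fetch_repo_stats(owner: str, repo: str) -> dict:
    """Fetch and parse repository metadata from the GitHub REST API."""
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urllib.request.urlopen(url) as resp:
        return parse_repo_stats(json.load(resp))


# Example (requires network access):
# fetch_repo_stats("galilai-group", "stable-pretraining")
```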