
bigscience-workshop/petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading

10.1k stars · 598 forks · +3/wk
Source: GitHub
bloom chatbot deep-learning distributed-systems falcon gpt guanaco language-models large-language-models llama machine-learning mixtral
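For context on what the repository itself provides, here is a minimal inference sketch adapted from the Petals quick-start; the model name is illustrative and the exact API may differ across releases.

```python
# Adapted from the Petals quick-start; requires `pip install petals`.
# The model name below is illustrative; the public swarm may serve different models.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "petals-team/StableBeluga2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

# The prompt is embedded locally; transformer blocks are served by remote peers.
inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```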
Star & Fork Trend chart (37 data points; series: Stars, Forks)

Multi-Source Signals

Growth Velocity

bigscience-workshop/petals gained +3 stars this period. 7-day velocity: 0.1%.
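A minimal sketch of how a 7-day velocity figure like this could be derived from star-count snapshots, assuming velocity is defined as stars gained over the trailing week divided by the current total (the dashboard's exact formula is not published):

```python
from datetime import date, timedelta

def seven_day_velocity(star_history: dict) -> float:
    """Stars gained over the trailing 7 days, as a percentage of the current total.

    star_history maps snapshot dates to cumulative star counts.
    """
    latest = max(star_history)
    cutoff = latest - timedelta(days=7)
    # Use the closest snapshot taken at or before the 7-day cutoff as the baseline.
    baseline = max(d for d in star_history if d <= cutoff)
    gained = star_history[latest] - star_history[baseline]
    return 100 * gained / star_history[latest]

# Illustrative numbers: ~10 stars gained on a 10.1k base is roughly 0.1%.
history = {date(2024, 6, 1): 10_090, date(2024, 6, 8): 10_100}
print(f"7-day velocity: {seven_day_velocity(history):.1f}%")
```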

Deep, signal-backed technical analysis is still being generated for this repository and is not yet available.

Metric         petals   tpot              CoreNLP   llm-engineer-toolkit
Stars          10.1k    10.0k             10.1k     10.1k
Forks          598      1.6k              2.7k      1.6k
Weekly Growth  +3       +0                +0        +4
Language       Python   Jupyter Notebook  Java      N/A
Sources        1        1                 1         1
License        MIT      LGPL-3.0          GPL-3.0   Apache-2.0
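The values in these rows map directly onto fields returned by GitHub's public REST /repos endpoint. A sketch of fetching them (the k-formatting helper and the single-slug list are illustrative; only the petals slug is shown in full on this page):

```python
import requests

# Only the petals slug appears in full on this page; the comparison rows
# would add their own owner/name slugs here.
REPOS = ["bigscience-workshop/petals"]

def fmt_k(n: int) -> str:
    """Format counts the way the table does, e.g. 10100 -> '10.1k'."""
    return f"{n / 1000:.1f}k" if n >= 1000 else str(n)

for full_name in REPOS:
    repo = requests.get(f"https://api.github.com/repos/{full_name}", timeout=10).json()
    spdx = (repo.get("license") or {}).get("spdx_id", "N/A")
    print(
        f"{full_name}: {fmt_k(repo['stargazers_count'])} stars, "
        f"{fmt_k(repo['forks_count'])} forks, "
        f"{repo.get('language') or 'N/A'}, {spdx}"
    )
```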

Capability Radar vs tpot (series: petals, tpot)
Maintenance Activity 0

Last code push was 578 days ago.
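The push age can be computed from the same /repos endpoint, which reports a pushed_at timestamp; a sketch:

```python
from datetime import datetime, timezone

import requests

repo = requests.get(
    "https://api.github.com/repos/bigscience-workshop/petals", timeout=10
).json()

# pushed_at is an ISO-8601 timestamp such as "2023-09-06T12:34:56Z".
pushed_at = datetime.fromisoformat(repo["pushed_at"].replace("Z", "+00:00"))
age_days = (datetime.now(timezone.utc) - pushed_at).days
print(f"Last code push was {age_days} days ago.")
```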

Community Engagement 30

Fork-to-star ratio: 5.9%. A lower fork ratio may indicate mostly passive usage (users star the project without forking or contributing).
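For reference, the ratio is simply forks divided by stars, using the counts shown in the header:

```python
stars, forks = 10_100, 598  # counts shown in the header (10.1k stars, 598 forks)
print(f"Fork-to-star ratio: {100 * forks / stars:.1f}%")  # -> 5.9%
```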

Issue Burden 70

Issue data not yet available.

Growth Momentum 42

+3 stars this period — 0.03% growth rate.

License Clarity 95

Licensed under MIT. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
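A sketch of how the five sub-scores above could be combined into a single health figure, assuming a simple weighted average; the dashboard's actual aggregation and weights are not published.

```python
# Sub-scores shown above (0-100, higher is healthier).
scores = {
    "maintenance_activity": 0,
    "community_engagement": 30,
    "issue_burden": 70,
    "growth_momentum": 42,
    "license_clarity": 95,
}

# Hypothetical equal weighting; the dashboard may weight dimensions differently.
weights = {name: 1.0 for name in scores}

overall = sum(scores[k] * weights[k] for k in scores) / sum(weights.values())
print(f"Overall health score: {overall:.0f}/100")  # -> 47/100
```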