
b4rtaz/distributed-llama

Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices means faster inference.

Stars: 2.9k · Forks: 220 · Weekly growth: +1 (source: GitHub)
distributed-computing distributed-llm llama2 llama3 llm llm-inference llms neural-network open-llm

[Chart: Star & Fork Trend — 42 data points; series: Stars, Forks]

Multi-Source Signals

Growth Velocity

b4rtaz/distributed-llama gained +1 star this period. 7-day velocity: 0.1%.


Metric         distributed-llama   genaiscript   neuralcoref   thinc
Stars          2.9k                2.9k          2.9k          2.9k
Forks          220                 224           471           294
Weekly Growth  +1                  -1            +0            +0
Language       C++                 TypeScript    C             Python
Sources        1                   1             1             1
License        MIT                 MIT           MIT           MIT

Capability Radar vs genaiscript

Maintenance Activity 71

Last code push 57 days ago.

Community Engagement 38

Fork-to-star ratio: 7.6%. A lower fork ratio may indicate passive usage rather than active modification.
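The ratio above is plain arithmetic on the fork and star counts from the comparison table; a minimal sketch (function name is illustrative, not part of the dashboard):

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Fork-to-star ratio, expressed as a percentage."""
    return forks / stars * 100

# Values for distributed-llama from the table above: 220 forks, ~2.9k stars.
ratio = fork_to_star_ratio(220, 2900)
print(f"{ratio:.1f}%")  # 7.6%
```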

Issue Burden 70

Issue data not yet available.

Growth Momentum 42

+1 star this period: a 0.03% growth rate.
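The growth rate appears to be new stars over total stars, which reproduces the 0.03% figure from the ~2.9k star count (a sketch under that assumption; the dashboard's exact formula isn't shown):

```python
def growth_rate(new_stars: int, total_stars: int) -> float:
    """Period growth rate as a percentage of the current star count."""
    return new_stars / total_stars * 100

# distributed-llama: +1 star against ~2,900 total.
rate = growth_rate(1, 2900)
print(f"{rate:.2f}%")  # 0.03%
```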

License Clarity 95

Licensed under MIT. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.