
lucidrains/mixture-of-experts

A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models

849 stars · 70 forks · +0/wk (source: GitHub)
artificial-intelligence deep-learning mixture-of-experts transformer
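The repository implements the sparsely-gated mixture-of-experts layer (Shazeer et al.), whose core idea is top-k gating: a small gating network scores every expert per token, only the k best-scoring experts are kept, and their outputs are combined with the renormalized gate weights. A minimal NumPy sketch of that routing step is below; it is a simplification for illustration, not the library's PyTorch API, and it runs every expert densely where the real implementation dispatches tokens only to their selected experts:

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k_gating(logits, k=2):
    """Sparse gating: keep the top-k gate logits per token,
    softmax over only those, and zero out all other experts."""
    topk_idx = np.argsort(logits, axis=-1)[:, -k:]           # (tokens, k)
    mask = np.zeros_like(logits, dtype=bool)
    np.put_along_axis(mask, topk_idx, True, axis=-1)
    masked = np.where(mask, logits, -np.inf)                 # drop non-selected experts
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)             # rows sum to 1, k nonzeros

# toy setup: 4 tokens of dim 8, routed across 4 experts (one weight matrix each)
tokens = rng.standard_normal((4, 8))
w_gate = rng.standard_normal((8, 4))                         # gating network
experts = rng.standard_normal((4, 8, 8))                     # (num_experts, dim_in, dim_out)

gates = top_k_gating(tokens @ w_gate, k=2)                   # (4, 4), 2 nonzero per row
expert_outs = np.einsum('td,edh->teh', tokens, experts)      # dense: every expert on every token
output = np.einsum('te,teh->th', gates, expert_outs)         # gate-weighted combination, (4, 8)
```

The library additionally applies noise to the gate logits during training and an auxiliary load-balancing loss so that tokens spread across experts; those details are omitted here.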

[Star & Fork Trend chart: stars and forks over 34 data points]

Multi-Source Signals

Growth Velocity

lucidrains/mixture-of-experts has +0 stars this period. Velocity data will be available after more historical data is collected.

Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.

Metric         mixture-of-experts   zipnerf-pytorch   pipeless     emotion-recognition-neural-networks
Stars          849                  849               849          847
Forks          70                   93                53           305
Weekly Growth  +0                   +0                +0           +0
Language       Python               Python            Rust         Python
Sources        1                    1                 1            1
License        MIT                  Apache-2.0        Apache-2.0   MIT

Capability Radar vs zipnerf-pytorch

[Radar chart comparing mixture-of-experts and zipnerf-pytorch across the dimensions below]
Maintenance Activity: 0

Last code push was 939 days ago.

Community Engagement: 41

Fork-to-star ratio: 8.2%. A lower fork ratio may indicate passive usage (many stargazers, few contributors).
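The ratio follows directly from the repository's counts shown above (70 forks, 849 stars); as a quick check:

```python
stars, forks = 849, 70          # counts from the comparison table above
ratio_pct = forks / stars * 100
print(f"{ratio_pct:.1f}%")      # → 8.2%
```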

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 30

No measurable growth in the current period (a first-day cold start is expected).

License Clarity: 95

Licensed under MIT, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.