
waybarrios/vllm-mlx

OpenAI and Anthropic compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
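Since the server advertises OpenAI compatibility, a chat request to it would presumably follow the standard `/v1/chat/completions` shape. A minimal sketch of building such a request with the standard library; the base URL, port, and model name here are assumptions for illustration, not values taken from the repository:

```python
import json
import urllib.request

# Assumed local endpoint; vllm-mlx's actual default host/port may differ.
BASE_URL = "http://localhost:8000/v1"

# Standard OpenAI-style chat payload; the model name is a placeholder.
payload = {
    "model": "mlx-community/Llama-3.2-3B-Instruct-4bit",
    "messages": [
        {"role": "user", "content": "Summarize continuous batching in one sentence."},
    ],
    "max_tokens": 128,
}

# Build (but do not send) the HTTP request, to show the wire format.
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.full_url)
print(json.loads(req.data)["model"])
```

With a running server, passing `req` to `urllib.request.urlopen` (or pointing any OpenAI client at the same base URL) would return a standard chat-completion response.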

774 stars · 176 forks · +0 stars/wk
anthropic apple-silicon audio-processing claude-code computer-vision image-understanding inference llm machine-learning macos mllm mlx

Star & Fork Trend (52 data points)


Multi-Source Signals

Growth Velocity

waybarrios/vllm-mlx gained +0 stars this period; 7-day velocity: 0.8%.


Metric         vllm-mlx   jax-js       QiZhenGPT   InferenceX
Stars          774        774          774         774
Forks          176        43           88          121
Weekly Growth  +0         +0           +0          +4
Language       Python     TypeScript   Python      Python
Sources        1          1            1           1
License        N/A        MIT          GPL-3.0     Apache-2.0

Capability Radar vs jax-js

Maintenance Activity 100

Last code push 7 days ago.

Community Engagement 100

Fork-to-star ratio: 22.7%, indicating an active community that forks and contributes.
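The ratio above follows directly from the headline figures (774 stars, 176 forks); a one-line check:

```python
# Headline figures for waybarrios/vllm-mlx from this page.
stars, forks = 774, 176
ratio = forks / stars * 100  # fork-to-star ratio as a percentage
print(f"{ratio:.1f}%")  # → 22.7%
```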

Issue Burden 70

Issue data not yet available.

Growth Momentum 30

No measurable growth in the current period (first-day cold start expected).

License Clarity 30

No clear license detected; proceed with caution.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.