Local-first · BYOK · MIT

Free-Way

Local LLM Control Plane

Route free LLM APIs through one local control plane.


Bring your own provider keys. Free-Way discovers models, normalizes OpenAI / Anthropic protocols, routes requests, and falls back across compatible providers — all from localhost.

No hosted proxy · Your keys stay local · No shared quota pool · OpenAI + Anthropic
git clone https://github.com/GoDiao/Free-Way.git && cd Free-Way
npm install && npm run build && npm start
Free-Way Console ONLINE
localhost:8787
Endpoints
POST /v1/chat/completions OpenAI
POST /v1/messages Anthropic
GET /v1/models Models
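Once the gateway is up, any OpenAI-style client can talk to it. A minimal sketch using only the Python standard library — the model name and the step of actually sending the request are illustrative; the real model list comes from GET /v1/models:

```python
import json
import urllib.request

# Build a chat-completion request against the local gateway.
# "llama-3.3-70b" is an illustrative model id from the discovered catalog.
payload = {
    "model": "llama-3.3-70b",
    "messages": [{"role": "user", "content": "Say hello."}],
}
req = urllib.request.Request(
    "http://localhost:8787/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With the gateway running, send it and read the OpenAI-shaped response:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI protocol, the same request works unchanged against any provider Free-Way routes it to.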
Route Trace
01 request received 12ms
02 normalize OpenAI → Provider
03 resolve model llama-3.3-70b
04 select provider groq
05 provider error rate_limit
06 fallback route openrouter
07 stream response 200 OK
Provider Health
groq healthy
mistral healthy
openrouter fallback-ready
cloudflare healthy
gateway online · 14 providers indexed · 120+ models discovered · OpenAI / Anthropic BYOK runtime

A local routing layer for fragmented free-tier APIs.

Free-Way acts as a local control plane between your AI tools and provider APIs. It normalizes protocol differences, resolves models, checks route availability, and falls back when a provider fails — all from localhost.

Local Control Plane

Runs entirely on your machine. Manage provider keys, monitor health, browse models, and inspect usage — from a single web console at localhost:8787. No hosted proxy. No shared quota pool.

keys: local
console: localhost:8787
proxy: none
mode: BYOK

Protocol Normalization

Expose OpenAI- and Anthropic-compatible endpoints from one server. Most clients only need a base URL change.
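One direction of that normalization can be sketched as a pure mapping. This is illustrative, not Free-Way's actual code; the field names follow the two public protocols (Anthropic carries the system prompt as a top-level field, OpenAI expects it as the first message):

```python
def anthropic_to_openai(body: dict) -> dict:
    """Sketch: translate an Anthropic /v1/messages request body
    into an OpenAI /v1/chat/completions request body."""
    messages = []
    # Anthropic's top-level "system" field becomes OpenAI's system message.
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body.get("messages", []))
    return {
        "model": body["model"],
        "messages": messages,
        # Anthropic requires max_tokens; OpenAI accepts the same name.
        "max_tokens": body.get("max_tokens"),
    }
```

The reverse mapping (OpenAI in, Anthropic out) works the same way, which is what lets one gateway serve both client families.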

Model Discovery

Fetch available models from supported providers and keep a unified free-tier catalog updated where possible.
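A unified catalog is essentially an inverted index from model id to the providers that serve it, so the router knows every route for a given model. A minimal sketch with illustrative provider and model names:

```python
def build_catalog(provider_models: dict[str, list[str]]) -> dict[str, list[str]]:
    """Sketch: merge per-provider model lists into one catalog,
    mapping each model id to all providers that offer it."""
    catalog: dict[str, list[str]] = {}
    for provider, models in provider_models.items():
        for model in models:
            catalog.setdefault(model, []).append(provider)
    return catalog
```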

Fallback Routing

Free-tier quotas shift constantly. When one route is rate-limited or unavailable, Free-Way tries another compatible provider.
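The fallback loop itself is simple: try each compatible route in order and move on when one is rate-limited or unreachable. A sketch under stated assumptions — `routes` and `send` stand in for Free-Way's internals:

```python
class RateLimited(Exception):
    """Raised when a provider reports its free-tier quota is exhausted."""

def route_with_fallback(routes, send):
    """Try each compatible provider route in order,
    falling through on rate limits and connection failures."""
    last_error = None
    for provider in routes:
        try:
            return provider, send(provider)
        except (RateLimited, ConnectionError) as err:
            last_error = err  # this route is out; try the next one
    raise RuntimeError("all routes exhausted") from last_error
```

This mirrors the route trace above: `groq` hits a rate limit at step 05, so step 06 re-routes the same request to `openrouter`.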

Health Checks

Monitor provider availability and latency from the console. Run on-demand checks for individual providers or all at once.
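A single health check is a timed probe against one provider. A sketch where `probe` is an illustrative callable standing in for a real HTTP request (e.g. listing that provider's models):

```python
import time

def check_provider(name: str, probe) -> dict:
    """Sketch: time a lightweight probe call and record
    the provider's status plus observed latency."""
    start = time.perf_counter()
    try:
        probe(name)
        status = "healthy"
    except Exception:
        status = "unreachable"
    latency_ms = (time.perf_counter() - start) * 1000
    return {"provider": name, "status": status, "latency_ms": round(latency_ms, 1)}
```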

Agent Workflow

Point every AI coding tool at the same local address. Configure once — Claude Code, Cursor, Continue.dev, OpenCode, and more.

One localhost endpoint, multiple provider routes.

Your tools talk to Free-Way once. Free-Way maps each request to a compatible provider route, normalizes the protocol, and streams the response back — falling back across providers when needed.

Clients
Claude Code
Cursor
Continue.dev
OpenCode
+4 more
Free-Way Gateway
localhost:8787
/v1 · /v1/messages
normalize · route · fallback
Providers
Groq · Mistral
Cerebras · Cohere
NVIDIA · Cloudflare
OpenRouter · GitHub
+6 more

Bring your own keys from supported free-tier providers.

Free-Way does not provide free API access. You register your own keys with each provider, configure them in the console, and Free-Way handles the rest.

High-speed inference
Groq · Cerebras · NVIDIA NIM
General model platforms
OpenRouter · Mistral · Cohere · GitHub Models
Regional & ecosystem providers
Cloudflare · SiliconFlow · Zhipu · OpenCode · LLM7 · Kilo · ZenMux

* Free-tier limits, quotas, and availability are controlled by each provider and may change at any time.

Point any AI coding agent here.

Most AI coding tools can point to custom OpenAI- or Anthropic-compatible endpoints. Configure once, then use any model Free-Way discovers.

Claude Code
Anthropic
/v1/messages
Cursor
OpenAI
/v1
Continue.dev
OpenAI
/v1
OpenCode
OpenAI
/v1
Cline
OpenAI
/v1
Aider
OpenAI
/v1
Codex CLI
OpenAI
/v1
OpenClaw
Anthropic
/v1/messages
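In practice the change is usually one or two settings. A hedged example using environment variables that many of these tools honor — exact variable names differ per tool (see docs/agents/), and the key value is a placeholder, since real provider keys live in the Free-Way console:

```shell
# OpenAI-protocol tools:
export OPENAI_BASE_URL="http://localhost:8787/v1"
export OPENAI_API_KEY="free-way"          # placeholder; real keys stay in the console

# Anthropic-protocol tools such as Claude Code:
export ANTHROPIC_BASE_URL="http://localhost:8787"
```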

Detailed setup guides: docs/agents/ →