A local routing layer for fragmented free-tier APIs.
Free-Way acts as a local control plane between your AI tools and provider APIs. It normalizes protocol differences, resolves models, checks route availability, and falls back when a provider fails — all from localhost.
Local Control Plane
Runs entirely on your machine. Manage provider keys, monitor health, browse models, and inspect usage from a single web console at localhost:8787.
No hosted proxy. No shared quota pool.
Protocol Normalization
Expose OpenAI- and Anthropic-compatible endpoints from one server. Most clients only need a Base URL change.
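A minimal sketch of what "only a Base URL change" means in practice. The port matches the console address above; the model name and path prefix are illustrative assumptions, not guaranteed Free-Way specifics.

```python
# Hypothetical sketch: instead of a provider's hosted endpoint, a client points
# its base URL at the local Free-Way server. The payload itself is a standard
# OpenAI-style chat completion request and does not change.
BASE_URL = "http://localhost:8787/v1"  # assumed local prefix; check your console

payload = {
    "model": "some-free-tier-model",  # any model Free-Way has discovered
    "messages": [{"role": "user", "content": "Hello"}],
}

# The request URL is the only thing that moved to localhost.
request_url = f"{BASE_URL}/chat/completions"
print(request_url)
```

Everything else in the client (headers, payload shape, streaming) stays as the tool already sends it.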
Model Discovery
Fetch available models from supported providers and keep a unified free-tier catalog updated where possible.
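One way a unified catalog like this can be built is by merging per-provider model lists so each model id records every provider able to serve it. This is an illustrative sketch under that assumption, not Free-Way's actual data model; provider and model names are made up.

```python
# Sketch: merge per-provider model lists into one unified catalog.
# Each catalog entry maps a model id to the providers that offer it.
def unified_catalog(per_provider_models):
    catalog = {}
    for provider, models in per_provider_models.items():
        for model_id in models:
            # Keep every provider that can serve this model id,
            # so routing later has alternatives to fall back to.
            catalog.setdefault(model_id, []).append(provider)
    return catalog

catalog = unified_catalog({
    "provider-alpha": ["m-small", "m-large"],
    "provider-beta": ["m-small"],
})
print(sorted(catalog["m-small"]))
```

A catalog shaped this way is what makes fallback routing possible: a model served by two providers has two routes.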
Fallback Routing
Free-tier quotas shift constantly. When one route is rate-limited or unavailable, Free-Way tries another compatible provider.
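The fallback behavior can be sketched as trying compatible routes in order and moving on when one is rate-limited or down. This is an illustrative simulation, not Free-Way's code; the error type and providers are invented for the example.

```python
# Minimal fallback-routing sketch: try each compatible provider route in order;
# a rate-limited or unavailable route is recorded and the next one is tried.
class RouteError(Exception):
    """Simulates a rate-limited or unavailable provider route."""

def route_request(prompt, routes):
    errors = []
    for name, handler in routes:
        try:
            return name, handler(prompt)
        except RouteError as exc:
            errors.append((name, str(exc)))  # fall through to the next route
    raise RuntimeError(f"all routes failed: {errors}")

def provider_a(prompt):
    raise RouteError("429 rate limited")  # simulated quota exhaustion

def provider_b(prompt):
    return f"response to {prompt!r}"

used, result = route_request("hello", [("a", provider_a), ("b", provider_b)])
print(used)
```

The request succeeds on the second route even though the first was rate-limited, which is the behavior described above.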
Health Checks
Monitor provider availability and latency from the console. Run on-demand checks for individual providers or all at once.
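A health check of this kind boils down to probing each provider and recording status plus latency. The sketch below assumes a probe is any callable that raises on failure; names and structure are illustrative, not Free-Way internals.

```python
import time

# Sketch of on-demand health checks: probe one provider, or all at once.
def check_provider(name, probe):
    start = time.monotonic()
    try:
        probe()
        ok = True
    except Exception:
        ok = False
    latency_ms = (time.monotonic() - start) * 1000
    return {"provider": name, "ok": ok, "latency_ms": round(latency_ms, 1)}

def check_all(probes):
    # "All at once": run every registered probe and collect the results.
    return [check_provider(name, probe) for name, probe in probes.items()]

def failing_probe():
    raise OSError("connection refused")  # simulated outage

results = check_all({
    "up-provider": lambda: None,  # healthy: probe returns without error
    "down-provider": failing_probe,
})
print([r["ok"] for r in results])
```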
Agent Workflow
Point every AI coding tool at the same local address. Configure once and use it with Claude Code, Cursor, Continue.dev, OpenCode, and more.
One localhost endpoint, multiple provider routes.
Your tools talk to Free-Way once. Free-Way maps each request to a compatible provider route, normalizes the protocol, and streams the response back — falling back across providers when needed.
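The "normalizes the protocol" step can be illustrated by one concrete translation: the Anthropic Messages API carries the system prompt as a top-level field, while OpenAI-style chat expects it as the first message. The sketch below shows that mapping under those assumptions; it is not Free-Way's actual translation layer, and only a few fields are handled.

```python
# Illustrative protocol normalization: translate an Anthropic-style
# /v1/messages payload into an OpenAI-style /v1/chat/completions payload.
def anthropic_to_openai(payload):
    messages = list(payload["messages"])
    # Anthropic: system prompt is a top-level "system" field.
    # OpenAI: system prompt is the first message with role "system".
    if "system" in payload:
        messages = [{"role": "system", "content": payload["system"]}] + messages
    return {
        "model": payload["model"],
        "messages": messages,
        "max_tokens": payload.get("max_tokens"),
    }

out = anthropic_to_openai({
    "model": "some-model",
    "system": "You are terse.",
    "messages": [{"role": "user", "content": "Hi"}],
    "max_tokens": 64,
})
print(out["messages"][0]["role"])
```

A real normalization layer also has to map streaming formats, tool calls, and error shapes, which is exactly the fragmentation the router hides from your tools.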
Bring your own keys from supported free-tier providers.
Free-Way does not provide free API access. You register your own keys with each provider, configure them in the console, and Free-Way handles the rest.
* Free-tier limits, quotas, and availability are controlled by each provider and may change at any time.
Point any AI coding agent here.
Most AI coding tools can point at custom OpenAI- or Anthropic-compatible endpoints. Configure once, then use any model Free-Way discovers.
Endpoints are exposed under /v1/, including the Anthropic-compatible /v1/messages route. Detailed setup guides: docs/agents/ →