# OpenAI-Compatible Provider Routing

CCS can bridge Claude Code into OpenAI-compatible providers through a local Anthropic-compatible proxy. This is useful for Hugging Face Inference Providers, OpenRouter, Ollama, llama.cpp, OpenAI-compatible self-hosted gateways, and similar APIs.

## Quick Start
Create an API profile, then launch it normally. The local proxy binds to 127.0.0.1, translates Anthropic `/v1/messages` requests into OpenAI chat-completions requests, and translates streaming responses back into Anthropic SSE.
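As a rough sketch of what that translation involves (field names follow the public Anthropic and OpenAI wire formats; CCS's actual translation layer may differ):

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic /v1/messages request body into an
    OpenAI chat-completions request body (illustrative sketch)."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first chat message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text blocks.
        if isinstance(content, list):
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
        "stream": body.get("stream", False),
    }
```

The streaming direction does the reverse: OpenAI `chat.completion.chunk` deltas are re-emitted as Anthropic SSE events.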
## Manual Proxy Lifecycle

Use the `ccs proxy` command when you want to start, inspect, activate, or stop the local proxy explicitly.

`ccs proxy activate` prints the local runtime contract for the selected profile, including `ANTHROPIC_BASE_URL`, `ANTHROPIC_AUTH_TOKEN`, model defaults, timeout settings, telemetry suppression, and `NO_PROXY`.
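In shell terms, that contract amounts to environment settings along these lines (the values below are placeholders for illustration, not real `ccs proxy activate` output):

```shell
# Hypothetical values; the real port, token, and model names come from
# your profile and the `ccs proxy activate` output.
export ANTHROPIC_BASE_URL="http://127.0.0.1:49321"   # local proxy endpoint
export ANTHROPIC_AUTH_TOKEN="local-proxy-token"      # placeholder token
export NO_PROXY="127.0.0.1,localhost"                # keep loopback traffic off any system proxy
```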
## Adaptive Ports

CCS stores local proxy state per profile, so multiple compatible profiles can run at the same time. Port selection precedence:

- CLI `--port`
- `proxy.profile_ports[profile]`
- shared preferred `proxy.port`
- adaptive per-profile fallback

`proxy.port: 3456` values are treated as unset, so older configs move onto adaptive ports instead of staying on the hot legacy default. Pin 3456 explicitly with `--port` or `proxy.profile_ports` only if you really need that exact binding.
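The precedence above can be sketched as follows (a simplified model, not CCS's actual implementation; the adaptive fallback is reduced to a deterministic hash for illustration):

```python
LEGACY_DEFAULT = 3456  # treated as unset so old configs migrate to adaptive ports

def select_port(profile: str, cli_port=None, profile_ports=None, shared_port=None) -> int:
    """Resolve the proxy port for a profile: CLI flag, then per-profile
    config, then shared preferred port, then an adaptive per-profile fallback."""
    if cli_port is not None:
        return cli_port                       # explicit CLI --port always wins
    port = (profile_ports or {}).get(profile)
    if port is not None:
        return port                           # proxy.profile_ports[profile]
    if shared_port is not None and shared_port != LEGACY_DEFAULT:
        return shared_port                    # shared preferred proxy.port
    # Adaptive per-profile fallback: derive a stable port from the profile name
    # (illustrative; a real implementation would probe for a free port).
    return 20000 + sum(profile.encode()) % 10000
```

Note how a configured `proxy.port` of 3456 falls through to the adaptive branch rather than pinning the legacy default.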
## Request-Time Routing

The proxy can route a request to another compatible profile or model at request time:

| Selector | Example | Behavior |
|---|---|---|
| `profile:model` | `deepseek:deepseek-reasoner` | route to exact profile and model |
| `profile` | `openrouter` | route to that profile's default |
| plain model id | `deepseek-chat` | match configured model slots exactly |
Request-time routing is configured under `proxy.routing` in the CCS config.
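The selector resolution in the table above can be sketched as (a hypothetical helper, not CCS's actual code; the profile and model slots are example data):

```python
PROFILES = {
    # Example data only: profile name -> default model and configured model slots.
    "deepseek": {"default": "deepseek-chat", "slots": ["deepseek-chat", "deepseek-reasoner"]},
    "openrouter": {"default": "openrouter/auto", "slots": ["openrouter/auto"]},
}

def resolve(selector: str) -> tuple[str, str]:
    """Resolve a request-time selector to a (profile, model) pair."""
    if ":" in selector:
        profile, model = selector.split(":", 1)   # profile:model -> exact route
        return profile, model
    if selector in PROFILES:                      # bare profile -> its default model
        return selector, PROFILES[selector]["default"]
    for name, cfg in PROFILES.items():            # plain model id -> exact slot match
        if selector in cfg["slots"]:
            return name, selector
    raise ValueError(f"no route for selector {selector!r}")
```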
## Related Project

claude-code-router is a standalone router whose transformer architecture informed this CCS flow. Use CCR when you want a standalone router; use CCS when you want routing integrated with CCS profiles, runtime bridges, and the `ccs` command surface.
