API Reference
Routify exposes an OpenAI-compatible API at https://routify.bytedance.city/v1.
Endpoints
- Chat Completions — POST /v1/chat/completions
- Embeddings — POST /v1/embeddings
- List Models — GET /v1/models
- Authentication — Bearer header format & key creation
- Rate Limits — per-tier limits + 429 handling
- Errors — error envelope and codes
Anthropic-compatible mirror
Routify also exposes an Anthropic Messages API at https://routify.bytedance.city/v1/anthropic. The protocol is forwarded verbatim — `cache_control`, `anthropic-version`, and `anthropic-beta` headers all pass through. This is the recommended endpoint for Claude Code and any other Anthropic SDK consumer.
Cost transparency headers
Every response from /v1/chat/completions carries:
- `X-Routify-Cost-USD` — exact dollars spent on this request
- `X-Routify-Cost-CNY` — exact RMB equivalent at the current rate
- `X-Routify-Model-Id` — the upstream model that actually served the request (after smart-router fallback, this may differ from the requested model)
- `X-Routify-Latency-Ms` — wall-clock latency
- `X-Routify-Provider` — upstream channel (e.g. `deepseek-direct`, `kimi-aliyun`)
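A minimal sketch of pulling this metadata off a response, assuming you have access to the raw headers as a string-to-string mapping (the helper name is illustrative, not part of the API):

```python
def parse_routify_headers(headers) -> dict:
    """Extract Routify cost/latency metadata from a response-headers mapping.

    Assumes exact header-name casing as documented; adapt the lookups if
    your HTTP client lower-cases header names.
    """
    return {
        "cost_usd": float(headers["X-Routify-Cost-USD"]),
        "cost_cny": float(headers["X-Routify-Cost-CNY"]),
        "model_id": headers["X-Routify-Model-Id"],
        "latency_ms": int(headers["X-Routify-Latency-Ms"]),
        "provider": headers["X-Routify-Provider"],
    }
```

With the openai SDK, one way to reach the headers is the raw-response wrapper, e.g. `client.chat.completions.with_raw_response.create(...)` and then passing its `.headers` to the helper above.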
SDKs
Use any OpenAI-compatible SDK. Just override the base URL.
```python
from openai import OpenAI

client = OpenAI(base_url="https://routify.bytedance.city/v1", api_key="rtf_...")
```

```typescript
import OpenAI from "openai";

const client = new OpenAI({ baseURL: "https://routify.bytedance.city/v1", apiKey: "rtf_..." });
```

```shell
curl https://routify.bytedance.city/v1/chat/completions \
  -H "Authorization: Bearer rtf_..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
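Per the Rate Limits section, requests over your tier's limit return 429 and should be retried with backoff. A hedged sketch of that loop — `RateLimited` here is a stand-in for whatever your SDK raises on 429 (the openai SDK raises `openai.RateLimitError`):

```python
import random
import time

class RateLimited(Exception):
    """Illustrative stand-in for your client's 429 error."""

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Invoke `call()`, retrying on RateLimited with exponential backoff.

    Delays grow as base_delay * 2**attempt, plus jitter so many clients
    hitting the limit together don't retry in lockstep.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimited:
            if attempt == max_retries - 1:
                raise  # out of retries — surface the 429 to the caller
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.25))
```

Wrap any request closure, e.g. `with_backoff(lambda: client.chat.completions.create(...))`.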