Set up LLM clients

Aperture by Tailscale is currently in alpha.

After installing Aperture and configuring your providers, point your LLM clients at your Aperture instance so their requests route through it.

Aperture sits between your LLM clients and your upstream providers and routes requests from each client to the appropriate provider. When a client sends a request specifying a model name, Aperture routes it to the provider that serves that model and automatically injects authentication. Any LLM client that supports a custom base URL can use Aperture.
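To illustrate, here is a minimal sketch of what an OpenAI-compatible request through Aperture looks like. The hostname, endpoint path, and model name are placeholder assumptions for illustration; substitute your own Aperture instance's address and a model your providers serve. Note that no provider API key appears in the request, because Aperture injects authentication upstream.

```python
import json
import urllib.request

# Hypothetical Aperture address on your tailnet; replace with your own.
APERTURE_BASE_URL = "http://aperture.example.ts.net"

# An OpenAI-compatible client only needs its base URL swapped for
# Aperture's. Aperture reads the "model" field from the request body
# and routes to the provider that serves that model, adding the
# provider's credentials itself -- no API key is set client-side.
req = urllib.request.Request(
    url=f"{APERTURE_BASE_URL}/v1/chat/completions",  # assumed endpoint path
    data=json.dumps(
        {
            "model": "gpt-4o",  # placeholder model name
            "messages": [{"role": "user", "content": "Hello"}],
        }
    ).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.full_url)  # → http://aperture.example.ts.net/v1/chat/completions
```

In a real client you would typically set this base URL once in the tool's configuration rather than constructing requests by hand; the per-client guides below show where that setting lives.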

Use the following guides to configure an LLM client to use your Aperture instance. Refer to the provider compatibility reference for the full list of supported providers and API formats.

- Configure Claude Code to route requests through your Aperture proxy.
- Configure OpenAI Codex to route requests through your Aperture proxy.
- Configure OpenCode to route requests through your Aperture proxy.
- Configure Gemini CLI, Roo Code, Cline, and other OpenAI-compatible tools to route requests through Aperture.