Set up LLM clients
After installing Aperture and configuring your providers, point your LLM clients at your Aperture instance so their requests route through it.
Aperture sits between your LLM clients and your upstream providers and routes requests from each client to the appropriate provider. When a client sends a request specifying a model name, Aperture routes it to the provider that serves that model and automatically injects authentication. Any LLM client that supports a custom base URL can use Aperture.
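To make the routing model concrete, here is a minimal sketch of what a client request looks like from Aperture's point of view. The base URL, port, API key, and model name are placeholders, and the sketch assumes an OpenAI-compatible chat completions endpoint; your instance's actual URL and supported API formats depend on your configuration.

```python
import json
import urllib.request

# Hypothetical values: substitute your Aperture instance's URL and key.
APERTURE_BASE_URL = "http://localhost:8080/v1"
APERTURE_API_KEY = "your-aperture-key"

# The client supplies only the base URL and a model name; Aperture matches
# the model to a configured provider and injects that provider's credentials
# before forwarding the request upstream.
body = json.dumps({
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Hello"}],
}).encode()

req = urllib.request.Request(
    f"{APERTURE_BASE_URL}/chat/completions",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {APERTURE_API_KEY}",
    },
)

# urllib.request.urlopen(req) would send the request once Aperture is running;
# here we only construct it to show the shape of the call.
print(req.full_url)
```

Any client that lets you override its base URL produces a request of this shape, which is why no Aperture-specific SDK is needed.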
Use the following guides to configure an LLM client to use your Aperture instance. Refer to the provider compatibility reference for the full list of supported providers and API formats.
Use Claude Code with Aperture
Configure Claude Code to route requests through your Aperture proxy.
Use Codex with Aperture
Configure OpenAI Codex to route requests through your Aperture proxy.
Use OpenCode with Aperture
Configure OpenCode to route requests through your Aperture proxy.
Use OpenAI-compatible tools with Aperture
Configure Gemini CLI, Roo Code, Cline, and other OpenAI-compatible tools to route requests through Aperture.
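For tools that honor the conventional OpenAI SDK environment variables, pointing them at Aperture is often just a matter of overriding the base URL. This is a sketch with placeholder values, not tool-specific instructions; some tools use their own settings files instead, which the guide above covers.

```shell
# Hypothetical values: substitute your Aperture instance's URL and key.
export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="your-aperture-key"
```

With these set, a tool that reads them sends its requests to Aperture instead of directly to OpenAI, and Aperture injects the real provider credentials on the way out.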