Use Codex with Aperture

Aperture by Tailscale is currently in alpha.

Configure Codex to send requests through Aperture by Tailscale so your organization gets centralized API key management, usage tracking, and session logging.

Prerequisites

Before you begin, you need:

  • An Aperture instance with at least one configured OpenAI-compatible provider, accessible from your device. Refer to get started with Aperture if you have not set this up.
  • The Aperture host URL (default: http://ai) accessible from your device. Use http://, not https://.
  • Codex installed on your device.

To avoid unexpected TLS issues, use http:// for the Aperture URL when configuring LLM clients. All connections remain encrypted using WireGuard, even when HTTPS is not used.

Aperture routes requests based on the model name, not the LLM client. Any LLM client configured to use Aperture can access any provider your admin has set up. Refer to the provider compatibility reference for the full list of supported providers and API formats.

Configure Codex

To configure Codex to use Aperture, create or edit the Codex configuration file (~/.codex/config.toml), set base_url to the Aperture URL, and set model to a Codex-compatible model:

model = "gpt-5.2-codex"
model_provider = "llm-ai-ts-net"
model_reasoning_effort = "high"

[model_providers.llm-ai-ts-net]
name = "Tailscale AI Gateway"
base_url = "http://ai/v1" # Required: Aperture URL

# Required for gpt-5-codex models
wire_api = "responses"

The wire_api = "responses" setting configures Codex to use the OpenAI Responses API format. You do not need to configure an API key because Aperture injects credentials automatically.

Verify the connection

  1. Send a test message in Codex.
  2. Open the Aperture dashboard at http://ai/ui/ and confirm the request appears on the Logs page.

If the request does not appear, refer to the Aperture troubleshooting guide.
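To rule out Codex itself, you can also send a request straight to Aperture from the command line. This is a sketch that assumes your admin has configured a provider serving the example model; no API key header is needed because Aperture injects credentials.

```shell
# Test request to Aperture's OpenAI-compatible endpoint.
# "gpt-5.2-codex" is the example model from this guide;
# substitute a model your admin has configured.
curl http://ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-5.2-codex",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

A successful response here, with the request visible on the Logs page, confirms Aperture is reachable and the problem lies in the Codex configuration.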

Next steps