Set up OpenAI
Last validated:
Configure an OpenAI provider in Aperture so your team can access GPT models through your tailnet. OpenAI is the default provider type in Aperture. When no compatibility block is specified, Aperture enables openai_chat automatically, but openai_responses is not enabled unless you set it explicitly.
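For example, a minimal provider entry that omits the compatibility block serves only the Chat Completions API (the field values here are illustrative placeholders):

```json
{
  "providers": {
    "openai": {
      "baseurl": "https://api.openai.com/",
      "apikey": "<your-openai-key>",
      "models": ["gpt-5-mini"]
    }
  }
}
```

With this configuration, openai_chat is enabled automatically, and openai_responses stays disabled until you add a compatibility block that sets it to true.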
Aperture routes requests based on the model name, not the LLM client. Any LLM client configured to use Aperture can access any provider your admin has set up. Refer to the provider compatibility reference for the full list of supported providers and API formats.
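As an illustration of model-based routing, consider a configuration with two providers; the second, non-OpenAI entry here is hypothetical:

```json
{
  "providers": {
    "openai": {
      "baseurl": "https://api.openai.com/",
      "apikey": "<your-openai-key>",
      "models": ["gpt-5", "gpt-5-mini"]
    },
    "other": {
      "baseurl": "https://example.com/",
      "apikey": "<your-other-key>",
      "models": ["other-model"]
    }
  }
}
```

A request for "gpt-5" is routed to the OpenAI provider, and a request for "other-model" to the other provider, regardless of which LLM client sent the request.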
Prerequisites
Before you begin, you need:
- An Aperture instance accessible from your device. Refer to get started with Aperture if you have not set this up.
- An OpenAI API key.
Configure the provider
Add OpenAI as a provider in your Aperture configuration:
```json
{
  "providers": {
    "openai": {
      "baseurl": "https://api.openai.com/",
      "apikey": "<your-openai-key>",
      "models": ["gpt-5", "gpt-5-mini", "gpt-4.1"],
      "name": "OpenAI",
      "compatibility": {
        "openai_chat": true,
        "openai_responses": true
      }
    }
  }
}
```
The openai_responses flag enables the Responses API, which tools like OpenAI Codex use. When you include a compatibility block, set both flags explicitly. Without a compatibility block, Aperture enables only openai_chat by default. Refer to the provider compatibility reference for the full list of flags.
After configuring the provider:
- Grant model access to the users or groups that need these models.
- Set up LLM clients to connect coding tools through Aperture.
Verify the provider
- Open the Aperture dashboard and confirm the provider appears with the expected models.
- Send a test request through a connected coding tool (such as Claude Code or Cursor), or use curl:

  ```shell
  curl http://<aperture-address>/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "<model-name>", "messages": [{"role": "user", "content": "hello"}]}'
  ```

  Replace `<aperture-address>` with your Aperture instance address and `<model-name>` with one of the models you configured for this provider.
- Check the Aperture dashboard session list for a new entry. The session shows the model name, token counts, and timestamp.
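If you enabled openai_responses, you can also send a test request to the Responses API. This sketch assumes Aperture proxies that API at the standard `/v1/responses` path; adjust the address and model name for your deployment:

```shell
curl http://<aperture-address>/v1/responses \
  -H "Content-Type: application/json" \
  -d '{"model": "<model-name>", "input": "hello"}'
```

As with the chat request above, a successful call should also appear in the dashboard session list.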
If the request fails, refer to the Aperture troubleshooting guide.