Set up LLM providers
Aperture routes LLM requests from configured LLM clients to upstream providers on behalf of your users. Each provider requires API credentials and an entry in the Aperture configuration. After you configure a provider, any LLM client set up to use Aperture can automatically access models from that provider.
Aperture supports the major LLM provider types, each with its own authentication method and API format.
Any provider that offers an OpenAI-compatible API (such as Anthropic, OpenAI, or Groq) can be configured directly in the Aperture configuration without a dedicated guide. Set the provider's baseurl, apikey, and models fields, and Aperture handles the rest.
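As an illustration, an OpenAI-compatible provider entry might look like the following. This is a hypothetical sketch, not Aperture's actual configuration schema; only the baseurl, apikey, and models fields named above are taken from this page, and the file layout, provider name, and model name are assumptions.

```yaml
# Hypothetical sketch of an Aperture provider entry; the real schema may differ.
providers:
  - name: groq                                 # label shown to LLM clients
    baseurl: https://api.groq.com/openai/v1    # OpenAI-compatible endpoint
    apikey: ${GROQ_API_KEY}                    # provider API credential
    models:
      - llama-3.3-70b-versatile                # models to expose through Aperture
```

Keeping the API key in an environment variable rather than inline avoids committing credentials to version control.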
The following guides cover providers that require additional setup beyond a base URL and API key. Refer to the provider compatibility reference for the full list of supported providers, API formats, and compatibility flags.
Use Vertex AI with Aperture
Configure a Vertex AI provider in Aperture with a GCP service account and key file so your team can access Gemini and Claude models through Aperture.
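For context, a Vertex AI provider typically needs a GCP project, region, and service-account key file rather than a bare base URL and API key, which is why it gets a dedicated guide. The sketch below illustrates that shape; every field name here is a hypothetical illustration, not Aperture's actual schema.

```yaml
# Hypothetical sketch; consult the Vertex AI guide for the real fields.
providers:
  - name: vertex
    type: vertex-ai
    project: my-gcp-project          # GCP project that hosts Vertex AI
    location: us-central1            # Vertex AI region
    credentialsfile: /etc/aperture/vertex-sa-key.json  # service account key file
```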
Set up a Vertex AI Express provider
Configure a Vertex AI Express provider in Aperture with a Google Cloud API key so your team can access Gemini models through your tailnet without managing service accounts.