To use a self-hosted LLM gateway (for example LiteLLM, Portkey, or an in-house proxy) as the inference provider, set inferenceProvider to gateway and supply the base URL and credentials described below. The gateway must implement the Anthropic Messages API:
  • POST /v1/messages with streaming and tool use is required.
  • GET /v1/models is optional. If the gateway implements it, Cowork on 3P auto-discovers available models; if not, set inferenceModels explicitly.
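As a quick sanity check that a gateway satisfies the required endpoint, you can construct a Messages API request against it. The sketch below only builds the request (sending it is commented out); the base URL, API key, and model name are placeholders, and the default bearer auth scheme is assumed:

```python
import json
import urllib.request

def build_messages_request(base_url, api_key, model, prompt):
    """Build a POST /v1/messages request for the gateway (bearer auth assumed)."""
    body = {
        "model": model,
        "max_tokens": 256,
        "stream": True,  # streaming support is required of the gateway
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/messages",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            "anthropic-version": "2023-06-01",
        },
        method="POST",
    )

# Placeholder values -- substitute your own gateway URL, key, and model.
req = build_messages_request("https://llm-gateway.example.corp",
                             "your-gateway-key", "some-model", "ping")
# urllib.request.urlopen(req) would send it; here we only build and inspect it.
```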
The data-residency and “no conversation data sent to Anthropic” statements elsewhere in these pages apply to Vertex AI and Bedrock only. When you use a gateway, data handling is determined by the gateway you operate and the upstream provider it routes to.

Configuration keys

Gateway base URL (inferenceGatewayBaseUrl)
Required: Yes
Gateway base URL. Must be an https:// URL.

Gateway API key (inferenceGatewayApiKey)
Required: Unless using sso or a credential helper
API key sent to the gateway. The field cannot be empty, so if your gateway authenticates by network identity and does not require a key, set a placeholder value.

Gateway auth scheme (inferenceGatewayAuthScheme)
Required: No
How the credential is sent. bearer (default) sends Authorization: Bearer <key>. x-api-key sends the x-api-key header instead. sso obtains the credential from the gateway’s own browser-based sign-in (OAuth 2.0 authorization server metadata at <inferenceGatewayBaseUrl>/.well-known/oauth-authorization-server and the device-authorization grant); in that case inferenceGatewayApiKey is not required.

Gateway extra headers (inferenceGatewayHeaders)
Required: No
JSON string array of additional HTTP headers sent on every inference request, in "Name: Value" form, for example ["X-Org-Id: team1"].
As an alternative to a static inferenceGatewayApiKey, configure an inferenceCredentialHelper executable that prints the gateway credential to stdout.
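A credential helper is simply an executable that writes the credential to stdout. The following is a minimal sketch, not a prescribed implementation: the token-file path is hypothetical, and a real helper might instead query a secrets manager or refresh a cached OAuth token.

```python
#!/usr/bin/env python3
# Hypothetical inferenceCredentialHelper: prints the gateway credential to stdout.
# Reading a token file passed as the first argument is only an illustration of
# the contract (credential on stdout); the token source is up to you.
import pathlib
import sys

def read_credential(token_path):
    # Strip surrounding whitespace so exactly the token is written to stdout.
    return pathlib.Path(token_path).read_text().strip()

if __name__ == "__main__" and len(sys.argv) > 1:
    print(read_credential(sys.argv[1]))
```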

Example

<key>inferenceProvider</key>
<string>gateway</string>
<key>inferenceGatewayBaseUrl</key>
<string>https://llm-gateway.example.corp</string>
<key>inferenceGatewayApiKey</key>
<string>your-gateway-key</string>
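If your gateway expects the key in the x-api-key header and you want an extra header on every request, the same configuration might look like the following. All values are placeholders, and it assumes inferenceGatewayHeaders is supplied as a string containing the JSON array described above:

<key>inferenceProvider</key>
<string>gateway</string>
<key>inferenceGatewayBaseUrl</key>
<string>https://llm-gateway.example.corp</string>
<key>inferenceGatewayAuthScheme</key>
<string>x-api-key</string>
<key>inferenceGatewayApiKey</key>
<string>your-gateway-key</string>
<key>inferenceGatewayHeaders</key>
<string>["X-Org-Id: team1"]</string>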