feat: add support for custom AI Gateway base URL (#315)

* feat: add support for custom AI Gateway base URL

- Add createGateway support with configurable baseURL
- Allow AI_GATEWAY_BASE_URL environment variable for:
  * Local development with custom Gateway
  * Self-hosted AI Gateway deployments
  * Enterprise proxy configurations
- Maintain backward compatibility: defaults to Vercel Gateway when not set
- Update documentation with usage examples and configuration notes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix: remove errant character in error message

---------

Co-authored-by: Claude Sonnet 4.5 <noreply@anthropic.com>
Co-authored-by: dayuan.jiang <jdy.toh@gmail.com>
RainX authored on 2025-12-18 21:00:21 +08:00 · committed by GitHub
Parent 81eb71e704 · Commit a91bd9d1e8
3 changed files with 36 additions and 2 deletions


@@ -140,17 +140,36 @@ OLLAMA_BASE_URL=http://localhost:11434

Vercel AI Gateway provides unified access to multiple AI providers through a single API key. This simplifies authentication and allows you to switch between providers without managing multiple API keys.

**Basic Usage (Vercel-hosted Gateway):**

```bash
AI_GATEWAY_API_KEY=your_gateway_api_key
AI_MODEL=openai/gpt-4o
```

**Custom Gateway URL (for local development or self-hosted Gateway):**

```bash
AI_GATEWAY_API_KEY=your_custom_api_key
AI_GATEWAY_BASE_URL=https://your-custom-gateway.com/v1/ai
AI_MODEL=openai/gpt-4o
```

Model format uses `provider/model` syntax:

- `openai/gpt-4o` - OpenAI GPT-4o
- `anthropic/claude-sonnet-4-5` - Anthropic Claude Sonnet 4.5
- `google/gemini-2.0-flash` - Google Gemini 2.0 Flash

**Configuration notes:**

- If `AI_GATEWAY_BASE_URL` is not set, the default Vercel Gateway URL (`https://ai-gateway.vercel.sh/v1/ai`) is used
- Custom base URL is useful for:
  - Local development with a custom Gateway instance
  - Self-hosted AI Gateway deployments
  - Enterprise proxy configurations
- When using a custom base URL, you must also provide `AI_GATEWAY_API_KEY`

Get your API key from the [Vercel AI Gateway dashboard](https://vercel.com/ai-gateway).

## Auto-Detection
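
(Not part of the diff.) The configuration above maps onto the AI SDK roughly as follows — a minimal sketch assuming the `@ai-sdk/gateway` and `ai` packages, with an illustrative model id and prompt:

```ts
import { createGateway } from "@ai-sdk/gateway"
import { generateText } from "ai"

// AI_GATEWAY_API_KEY / AI_GATEWAY_BASE_URL are the variables documented above.
// When AI_GATEWAY_BASE_URL is unset, the Vercel-hosted default URL applies.
const gateway = createGateway({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: process.env.AI_GATEWAY_BASE_URL ?? "https://ai-gateway.vercel.sh/v1/ai",
})

// Model ids use the "provider/model" syntax described above.
const { text } = await generateText({
  model: gateway("openai/gpt-4o"),
  prompt: "Hello through the gateway",
})
console.log(text)
```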


@@ -72,6 +72,8 @@ AI_MODEL=global.anthropic.claude-sonnet-4-5-20250929-v1:0

# Get your API key from: https://vercel.com/ai-gateway
# Model format: "provider/model" e.g., "openai/gpt-4o", "anthropic/claude-sonnet-4-5"
# AI_GATEWAY_API_KEY=...
# AI_GATEWAY_BASE_URL=https://your-custom-gateway.com/v1/ai # Optional: Custom Gateway URL (for local dev or self-hosted Gateway)
#                                                           # If not set, uses Vercel default: https://ai-gateway.vercel.sh/v1/ai

# Langfuse Observability (Optional)
# Enable LLM tracing and analytics - https://langfuse.com


@@ -2,7 +2,7 @@ import { createAmazonBedrock } from "@ai-sdk/amazon-bedrock"
import { createAnthropic } from "@ai-sdk/anthropic"
import { azure, createAzure } from "@ai-sdk/azure"
import { createDeepSeek, deepseek } from "@ai-sdk/deepseek"
import { createGateway, gateway } from "@ai-sdk/gateway"
import { createGoogleGenerativeAI, google } from "@ai-sdk/google"
import { createOpenAI, openai } from "@ai-sdk/openai"
import { fromNodeProviderChain } from "@aws-sdk/credential-providers"
@@ -683,7 +683,20 @@ export function getAIModel(overrides?: ClientOverrides): ModelConfig {
      // Vercel AI Gateway - unified access to multiple AI providers
      // Model format: "provider/model" e.g., "openai/gpt-4o", "anthropic/claude-sonnet-4-5"
      // See: https://vercel.com/ai-gateway
      const apiKey = overrides?.apiKey || process.env.AI_GATEWAY_API_KEY
      const baseURL = overrides?.baseUrl || process.env.AI_GATEWAY_BASE_URL

      // Only use custom configuration if explicitly set (local dev or custom Gateway)
      // Otherwise undefined → AI SDK uses Vercel default (https://ai-gateway.vercel.sh/v1/ai) + OIDC
      if (baseURL || overrides?.apiKey) {
        const customGateway = createGateway({
          apiKey,
          ...(baseURL && { baseURL }),
        })
        model = customGateway(modelId)
      } else {
        model = gateway(modelId)
      }
      break
    }
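
(Not part of the diff.) A hypothetical caller's view of the new branch — `getAIModel` and the `apiKey`/`baseUrl` override fields appear in the hunk above, while the import path and the surrounding usage are assumptions:

```ts
import { getAIModel } from "./ai/models" // import path is an assumption

// Gateway provider selected, nothing custom configured → else branch:
// the default Vercel-hosted gateway() client (OIDC or AI_GATEWAY_API_KEY).
const defaults = getAIModel()

// Custom base URL supplied as an override → createGateway() branch with that URL.
const custom = getAIModel({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseUrl: "https://your-custom-gateway.com/v1/ai",
})
```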