AI Provider Configuration
This guide explains how to configure different AI model providers for next-ai-draw-io.
Quick Start
- Copy `.env.example` to `.env.local`
- Set your API key for your chosen provider
- Set `AI_MODEL` to your desired model
- Run `npm run dev`
Supported Providers
Google Gemini
```
GOOGLE_GENERATIVE_AI_API_KEY=your_api_key
AI_MODEL=gemini-2.0-flash
```
Optional custom endpoint:
```
GOOGLE_BASE_URL=https://your-custom-endpoint
```
OpenAI
```
OPENAI_API_KEY=your_api_key
AI_MODEL=gpt-4o
```
Optional custom endpoint (for OpenAI-compatible services):
```
OPENAI_BASE_URL=https://your-custom-endpoint/v1
```
Anthropic
```
ANTHROPIC_API_KEY=your_api_key
AI_MODEL=claude-sonnet-4-5-20250929
```
Optional custom endpoint:
```
ANTHROPIC_BASE_URL=https://your-custom-endpoint
```
DeepSeek
```
DEEPSEEK_API_KEY=your_api_key
AI_MODEL=deepseek-chat
```
Optional custom endpoint:
```
DEEPSEEK_BASE_URL=https://your-custom-endpoint
```
SiliconFlow (OpenAI-compatible)
```
SILICONFLOW_API_KEY=your_api_key
AI_MODEL=deepseek-ai/DeepSeek-V3 # example; use any SiliconFlow model id
```
Optional custom endpoint (defaults to the recommended domain):
```
SILICONFLOW_BASE_URL=https://api.siliconflow.com/v1 # or https://api.siliconflow.cn/v1
```
Azure OpenAI
```
AZURE_API_KEY=your_api_key
AZURE_RESOURCE_NAME=your-resource-name # Required: your Azure resource name
AI_MODEL=your-deployment-name
```
Or use a custom endpoint instead of resource name:
```
AZURE_API_KEY=your_api_key
AZURE_BASE_URL=https://your-resource.openai.azure.com # Alternative to AZURE_RESOURCE_NAME
AI_MODEL=your-deployment-name
```
Optional reasoning configuration:
```
AZURE_REASONING_EFFORT=low # Optional: low, medium, high
AZURE_REASONING_SUMMARY=detailed # Optional: none, brief, detailed
```
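The two Azure configurations above (resource name vs. explicit endpoint) resolve to a single base URL. A minimal sketch of that precedence, assuming `AZURE_BASE_URL` wins when both are set; `resolveAzureEndpoint` is a hypothetical helper name, not the project's actual code:

```typescript
// Resolve the Azure OpenAI endpoint from env: prefer an explicit
// AZURE_BASE_URL, otherwise derive the URL from AZURE_RESOURCE_NAME.
function resolveAzureEndpoint(
    env: Record<string, string | undefined>
): string {
    if (env.AZURE_BASE_URL) return env.AZURE_BASE_URL;
    if (env.AZURE_RESOURCE_NAME) {
        return `https://${env.AZURE_RESOURCE_NAME}.openai.azure.com`;
    }
    throw new Error("Set AZURE_RESOURCE_NAME or AZURE_BASE_URL");
}
```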
AWS Bedrock
```
AWS_REGION=us-west-2
AWS_ACCESS_KEY_ID=your_access_key_id
AWS_SECRET_ACCESS_KEY=your_secret_access_key
AI_MODEL=anthropic.claude-sonnet-4-5-20250929-v1:0
```
Note: On AWS (Lambda, EC2 with IAM role), credentials are automatically obtained from the IAM role.
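The IAM-role fallback amounts to: pass explicit credentials only when both variables are set, and otherwise hand nothing to the SDK so its default credential chain (IAM role on Lambda/EC2, shared config, and so on) takes over. A sketch under that assumption; `bedrockCredentials` is a hypothetical helper, not the project's actual implementation:

```typescript
interface AwsCredentials {
    accessKeyId: string;
    secretAccessKey: string;
}

// Return explicit credentials only when both env vars are present;
// returning undefined lets the AWS SDK's default credential chain
// resolve them (e.g. from an attached IAM role).
function bedrockCredentials(
    env: Record<string, string | undefined>
): AwsCredentials | undefined {
    const id = env.AWS_ACCESS_KEY_ID;
    const secret = env.AWS_SECRET_ACCESS_KEY;
    if (id && secret) {
        return { accessKeyId: id, secretAccessKey: secret };
    }
    return undefined; // fall back to the SDK's default chain
}
```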
OpenRouter
```
OPENROUTER_API_KEY=your_api_key
AI_MODEL=anthropic/claude-sonnet-4
```
Optional custom endpoint:
```
OPENROUTER_BASE_URL=https://your-custom-endpoint
```
Ollama (Local)
```
AI_PROVIDER=ollama
AI_MODEL=llama3.2
```
Optional custom URL:
```
OLLAMA_BASE_URL=http://localhost:11434
```
Vercel AI Gateway
Vercel AI Gateway provides unified access to multiple AI providers through a single API key. This simplifies authentication and allows you to switch between providers without managing multiple API keys.
```
AI_GATEWAY_API_KEY=your_gateway_api_key
AI_MODEL=openai/gpt-4o
```
Model format uses provider/model syntax:
- `openai/gpt-4o` - OpenAI GPT-4o
- `anthropic/claude-sonnet-4-5` - Anthropic Claude Sonnet 4.5
- `google/gemini-2.0-flash` - Google Gemini 2.0 Flash
Get your API key from the Vercel AI Gateway dashboard.
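The `provider/model` format splits at the first `/`, so model ids that themselves contain a slash (e.g. `deepseek-ai/DeepSeek-V3` on some providers) keep their remainder intact. A small sketch of that parsing; `parseGatewayModel` is a hypothetical helper, not part of next-ai-draw-io:

```typescript
// Split a gateway model string like "openai/gpt-4o" into its
// provider and model parts, splitting at the first "/" only.
function parseGatewayModel(id: string): { provider: string; model: string } {
    const slash = id.indexOf("/");
    if (slash === -1) {
        throw new Error(`Expected "provider/model", got "${id}"`);
    }
    return {
        provider: id.slice(0, slash),
        model: id.slice(slash + 1), // keep any further slashes
    };
}
```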
Auto-Detection
If you only configure one provider's API key, the system will automatically detect and use that provider. No need to set AI_PROVIDER.
If you configure multiple API keys, you must explicitly set AI_PROVIDER:
```
AI_PROVIDER=google # or: openai, anthropic, deepseek, siliconflow, azure, bedrock, openrouter, ollama, gateway
```
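The auto-detection rule can be sketched as: if exactly one provider key is present, use that provider; otherwise require `AI_PROVIDER`. This is an illustrative sketch, not the project's actual code, and it omits Bedrock and Ollama, which are not selected by a single API-key variable:

```typescript
// Map each auto-detectable provider to its API-key variable.
const PROVIDER_KEYS: Record<string, string> = {
    google: "GOOGLE_GENERATIVE_AI_API_KEY",
    openai: "OPENAI_API_KEY",
    anthropic: "ANTHROPIC_API_KEY",
    deepseek: "DEEPSEEK_API_KEY",
    siliconflow: "SILICONFLOW_API_KEY",
    azure: "AZURE_API_KEY",
    openrouter: "OPENROUTER_API_KEY",
    gateway: "AI_GATEWAY_API_KEY",
};

// An explicit AI_PROVIDER always wins; otherwise exactly one
// configured key is required for auto-detection to succeed.
function detectProvider(env: Record<string, string | undefined>): string {
    if (env.AI_PROVIDER) return env.AI_PROVIDER;
    const configured = Object.keys(PROVIDER_KEYS).filter(
        (provider) => !!env[PROVIDER_KEYS[provider]]
    );
    if (configured.length === 1) return configured[0];
    throw new Error(
        "Zero or multiple provider keys found; set AI_PROVIDER explicitly."
    );
}
```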
Model Capability Requirements
This task requires exceptionally strong model capabilities, as it involves generating long-form text with strict formatting constraints (draw.io XML).
Recommended models:
- Claude Sonnet 4.5 / Opus 4.5
Note on Ollama: While Ollama is supported as a provider, it's generally not practical for this use case unless you're running high-capability models like DeepSeek R1 or Qwen3-235B locally.
Temperature Setting
You can optionally configure the temperature via environment variable:
```
TEMPERATURE=0 # More deterministic output (recommended for diagrams)
```
Important: Leave TEMPERATURE unset for models that don't support temperature settings, such as:
- GPT-5.1 and other reasoning models
- Some specialized models
When unset, the model uses its default behavior.
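The "leave it unset" behavior means the temperature field should be omitted from the request entirely, not sent as a default value. A sketch of that, assuming a hypothetical `temperatureFromEnv` helper rather than the project's actual code:

```typescript
// Return a temperature only when TEMPERATURE is set; returning
// undefined lets callers omit the field, so models that reject the
// parameter (e.g. reasoning models) never receive it.
function temperatureFromEnv(
    env: Record<string, string | undefined>
): number | undefined {
    const raw = env.TEMPERATURE;
    if (raw === undefined || raw.trim() === "") return undefined;
    const value = Number(raw);
    if (Number.isNaN(value)) {
        throw new Error(`TEMPERATURE must be a number, got "${raw}"`);
    }
    return value;
}
```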
Recommendations
- Best experience: Use models with vision support (GPT-4o, Claude, Gemini) for image-to-diagram features
- Budget-friendly: DeepSeek offers competitive pricing
- Privacy: Use Ollama for fully local, offline operation (requires powerful hardware)
- Flexibility: OpenRouter provides access to many models through a single API