mirror of https://github.com/DayuanJiang/next-ai-draw-io.git
synced 2026-01-02 22:32:27 +08:00
* Fix: remove hardcoded temperature parameter to support reasoning models
* feat: make temperature configurable via AI_TEMPERATURE env var
    - Instead of removing temperature entirely, make it optional via env var
    - Set AI_TEMPERATURE=0 for deterministic output (recommended for diagrams)
    - Leave unset for models that don't support temperature (e.g., GPT-5.1 reasoning)
* docs: add AI_TEMPERATURE env var documentation
    - Update env.example with AI_TEMPERATURE option
    - Update README.md configuration section
    - Add Temperature Setting section in ai-providers.md
* docs: add TEMPERATURE env var documentation
    - Update env.example with TEMPERATURE option
    - Update README.md, README_CN.md, README_JA.md configuration sections
    - Add Temperature Setting section in ai-providers.md
    - Update route.ts to use TEMPERATURE env var

Co-authored-by: dayuan.jiang <jiangdy@amazon.co.jp>
58 lines
2.0 KiB
Plaintext
# AI Provider Configuration
# AI_PROVIDER: Which provider to use
# Options: bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek
# Default: bedrock
AI_PROVIDER=bedrock

# AI_MODEL: The model ID for your chosen provider (REQUIRED)
AI_MODEL=global.anthropic.claude-sonnet-4-5-20250929-v1:0

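The two variables above drive model selection for the whole app. A minimal sketch of how an application might resolve them (this is an illustrative helper, not the project's actual code; the function name and return shape are assumptions):

```typescript
// Resolve the provider settings configured by this file.
// AI_PROVIDER falls back to the documented default ("bedrock");
// AI_MODEL is marked REQUIRED above, so we fail fast when it is missing.
function resolveAIConfig(
    env: Record<string, string | undefined>,
): { provider: string; model: string } {
    const provider = env.AI_PROVIDER ?? "bedrock";
    const model = env.AI_MODEL;
    if (!model) {
        throw new Error("AI_MODEL must be set");
    }
    return { provider, model };
}
```

Failing fast on a missing AI_MODEL surfaces a misconfigured deployment immediately instead of at the first chat request.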
# AWS Bedrock Configuration
# AWS_REGION=us-east-1
# AWS_ACCESS_KEY_ID=your-access-key-id
# AWS_SECRET_ACCESS_KEY=your-secret-access-key

# OpenAI Configuration
# OPENAI_API_KEY=sk-...
# OPENAI_BASE_URL=https://api.openai.com/v1 # Optional: Custom OpenAI-compatible endpoint
# OPENAI_ORGANIZATION=org-... # Optional
# OPENAI_PROJECT=proj_... # Optional

# Anthropic (Direct) Configuration
# ANTHROPIC_API_KEY=sk-ant-...
# ANTHROPIC_BASE_URL=https://your-custom-anthropic/v1

# Google Generative AI Configuration
# GOOGLE_GENERATIVE_AI_API_KEY=...
# GOOGLE_BASE_URL=https://generativelanguage.googleapis.com/v1beta # Optional: Custom endpoint

# Azure OpenAI Configuration
# AZURE_RESOURCE_NAME=your-resource-name
# AZURE_API_KEY=...
# AZURE_BASE_URL=https://your-resource.openai.azure.com # Optional: Custom endpoint (overrides resourceName)

# Ollama (Local) Configuration
# OLLAMA_BASE_URL=http://localhost:11434/api # Optional, defaults to localhost

# OpenRouter Configuration
# OPENROUTER_API_KEY=sk-or-v1-...
# OPENROUTER_BASE_URL=https://openrouter.ai/api/v1 # Optional: Custom endpoint

# DeepSeek Configuration
# DEEPSEEK_API_KEY=sk-...
# DEEPSEEK_BASE_URL=https://api.deepseek.com/v1 # Optional: Custom endpoint

# Langfuse Observability (Optional)
# Enable LLM tracing and analytics - https://langfuse.com
# LANGFUSE_PUBLIC_KEY=pk-lf-...
# LANGFUSE_SECRET_KEY=sk-lf-...
# LANGFUSE_BASEURL=https://cloud.langfuse.com # EU region, use https://us.cloud.langfuse.com for US

# Temperature (Optional)
# Controls randomness in AI responses. Lower = more deterministic.
# Leave unset for models that don't support temperature (e.g., GPT-5.1 reasoning models)
# TEMPERATURE=0

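The commit notes above mention that route.ts reads TEMPERATURE from the environment. A sketch of one way to do that safely (illustrative only; the helper name is an assumption, not the project's actual code): the key point is returning undefined when the variable is unset, so the temperature parameter can be omitted entirely for reasoning models that reject it.

```typescript
// Turn the optional TEMPERATURE env var into a number, or undefined.
// Unset or non-numeric values yield undefined, so callers can simply
// spread { temperature } and have the parameter dropped when absent.
function resolveTemperature(raw: string | undefined): number | undefined {
    if (raw === undefined || raw.trim() === "") return undefined;
    const value = Number(raw);
    return Number.isNaN(value) ? undefined : value;
}
```

Note that `"0"` must parse to the number 0 (a valid, deterministic setting), which is why the check is against undefined/empty rather than truthiness.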
# Access Control (Optional)
# ACCESS_CODE_LIST=your-secret-code,another-code
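ACCESS_CODE_LIST is a comma-separated list, so a check against it needs to split and trim. A minimal sketch of such a check (assumed semantics: unset means access control is disabled; the function name is hypothetical, not the project's actual code):

```typescript
// Check a submitted code against the comma-separated ACCESS_CODE_LIST.
// When the list is unset, access control is off and everyone is allowed.
function isAccessAllowed(list: string | undefined, code: string): boolean {
    if (!list) return true;
    const codes = list
        .split(",")
        .map((c) => c.trim())
        .filter((c) => c.length > 0);
    return codes.includes(code);
}
```

Trimming each entry makes the check tolerant of spaces after commas in the env value (e.g. `a, b`).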