fix: remove hardcoded temperature parameter for models that don't support it (#133)

* Fix: remove hardcoded temperature parameter to support reasoning models

* feat: make temperature configurable via AI_TEMPERATURE env var

- Instead of removing temperature entirely, make it optional via env var
- Set AI_TEMPERATURE=0 for deterministic output (recommended for diagrams)
- Leave unset for models that don't support temperature (e.g., GPT-5.1 reasoning); see the example entry after this list
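
For illustration, an env.example entry along these lines would match that commit (the variable name and the `0`/unset behavior come from the commit message; the comment wording here is illustrative, not the actual file contents):

```env
# Optional sampling temperature for the model.
# Set to 0 for deterministic output (recommended for diagram generation).
# Leave unset for models that don't accept a temperature parameter,
# e.g. reasoning models.
# AI_TEMPERATURE=0
```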

* docs: add AI_TEMPERATURE env var documentation

- Update env.example with AI_TEMPERATURE option
- Update README.md configuration section
- Add Temperature Setting section in ai-providers.md

* docs: add TEMPERATURE env var documentation

- Update env.example with TEMPERATURE option
- Update README.md, README_CN.md, README_JA.md configuration sections
- Add Temperature Setting section in ai-providers.md
- Update route.ts to read the TEMPERATURE env var (see the sketch after this list)
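
A minimal sketch of what the route.ts change could look like, assuming the route streams completions via the Vercel AI SDK's `streamText` (an assumption; the commit only states that route.ts reads the TEMPERATURE env var, and the provider/model wiring shown here is a placeholder):

```ts
// Hypothetical route.ts excerpt, not the actual file contents.
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai"; // stand-in; the app resolves the provider from AI_PROVIDER

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Read TEMPERATURE only when it is set and numeric, so models that reject
  // the parameter (e.g. reasoning models) never receive it.
  const raw = process.env.TEMPERATURE;
  const temperature =
    raw !== undefined && raw !== "" && !Number.isNaN(Number(raw))
      ? Number(raw)
      : undefined;

  const result = streamText({
    model: openai(process.env.AI_MODEL ?? "gpt-4o-mini"),
    messages,
    // Spread conditionally so `temperature` is omitted entirely when unset,
    // instead of being sent as a hardcoded default.
    ...(temperature !== undefined ? { temperature } : {}),
  });

  return result.toTextStreamResponse();
}
```

With `TEMPERATURE=0` the request carries `temperature: 0` for deterministic output; with the variable unset, no temperature is sent at all, which is what reasoning models expect.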

---------

Co-authored-by: dayuan.jiang <jiangdy@amazon.co.jp>
Author: Biki Kalita
Date: 2025-12-06 22:04:59 +05:30
Committed by: GitHub
Parent: 05d58025c4
Commit: 8b578a456e
6 changed files with 25 additions and 1 deletion

README_CN.md

@@ -149,6 +149,7 @@ cp env.example .env.local
 - `AI_PROVIDER`: set to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek)
 - `AI_MODEL`: set to the specific model you want to use
 - Add the API keys required by your provider
+- `TEMPERATURE`: optional temperature setting (e.g. `0` for deterministic output). Leave it unset for models that don't support this parameter (such as reasoning models).
 - `ACCESS_CODE_LIST`: access passwords, optional; separate multiple passwords with commas.
 > Warning: if `ACCESS_CODE_LIST` is left empty, anyone can use your deployed site directly, which may burn through your tokens very quickly, so setting this option is recommended.