feat: add SiliconFlow as a supported AI provider (#137)

* feat: add SiliconFlow as a supported AI provider in documentation and configuration

* fix: update SiliconFlow configuration comment to English
QiyuanChen authored 2025-12-07 09:22:57 +08:00, committed by GitHub
parent b1bc1a6dc6
commit d8cdd049d1
6 changed files with 44 additions and 7 deletions

View File

@@ -88,6 +88,7 @@ Diagrams are represented as XML that can be rendered in draw.io. The AI processe
 - Ollama
 - OpenRouter
 - DeepSeek
+- SiliconFlow
 All providers except AWS Bedrock and OpenRouter support custom endpoints.
@@ -146,7 +147,7 @@ cp env.example .env.local
 Edit `.env.local` and configure your chosen provider:
-- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek)
+- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow)
 - Set `AI_MODEL` to the specific model you want to use
 - Add the required API keys for your provider
 - `TEMPERATURE`: Optional temperature setting (e.g., `0` for deterministic output). Leave unset for models that don't support it (e.g., reasoning models).
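
For orientation, a minimal TypeScript sketch of how these variables are consumed downstream. `generateText` from the `ai` package is the real AI SDK call; the `getAIModel` import path is hypothetical, and forwarding `TEMPERATURE` only when set matches the note in the last bullet:

```ts
import { generateText } from "ai"
import { getAIModel } from "./src/lib/ai" // hypothetical path to the config module

// Only forward TEMPERATURE when it is set: some models (e.g. reasoning
// models) reject the parameter, which is why the docs say to leave it unset.
const temperature =
  process.env.TEMPERATURE !== undefined
    ? Number(process.env.TEMPERATURE)
    : undefined

const { model } = getAIModel() // reads AI_PROVIDER, AI_MODEL, and the API keys
const { text } = await generateText({
  model,
  prompt: "Generate a simple draw.io flowchart.",
  ...(temperature !== undefined ? { temperature } : {}),
})
```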

View File

@@ -88,6 +88,7 @@ https://github.com/user-attachments/assets/b2eef5f3-b335-4e71-a755-dc2e80931979
 - Ollama
 - OpenRouter
 - DeepSeek
+- SiliconFlow
 All providers except AWS Bedrock and OpenRouter support custom endpoints.
@@ -146,7 +147,7 @@ cp env.example .env.local
 Edit `.env.local` and configure your chosen provider:
-- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek)
+- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow)
 - Set `AI_MODEL` to the specific model you want to use
 - Add the required API keys for your provider
 - `TEMPERATURE`: Optional temperature setting (e.g., `0` for deterministic output). Leave unset for models that don't support it (e.g., reasoning models).

View File

@@ -88,6 +88,7 @@ https://github.com/user-attachments/assets/b2eef5f3-b335-4e71-a755-dc2e80931979
 - Ollama
 - OpenRouter
 - DeepSeek
+- SiliconFlow
 All providers except AWS Bedrock and OpenRouter support custom endpoints.
@@ -146,7 +147,7 @@ cp env.example .env.local
 Edit `.env.local` and configure your chosen provider:
-- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek)
+- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow)
 - Set `AI_MODEL` to the specific model you want to use
 - Add the required API keys for your provider
 - `TEMPERATURE`: Optional temperature setting (e.g., `0` for deterministic output). Leave unset for models that don't support it (e.g., reasoning models).

View File

@@ -63,6 +63,19 @@ Optional custom endpoint:
 DEEPSEEK_BASE_URL=https://your-custom-endpoint
 ```
+### SiliconFlow (OpenAI-compatible)
+```bash
+SILICONFLOW_API_KEY=your_api_key
+AI_MODEL=deepseek-ai/DeepSeek-V3 # example; use any SiliconFlow model id
+```
+Optional custom endpoint (defaults to the recommended domain):
+```bash
+SILICONFLOW_BASE_URL=https://api.siliconflow.com/v1 # or https://api.siliconflow.cn/v1
+```
 ### Azure OpenAI
 ```bash
@@ -120,7 +133,7 @@ If you only configure **one** provider's API key, the system will automatically
 If you configure **multiple** API keys, you must explicitly set `AI_PROVIDER`:
 ```bash
-AI_PROVIDER=google # or: openai, anthropic, deepseek, azure, bedrock, openrouter, ollama
+AI_PROVIDER=google # or: openai, anthropic, deepseek, siliconflow, azure, bedrock, openrouter, ollama
 ```
 ## Model Capability Requirements
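
A short TypeScript sketch of the single-key auto-detection rule described above. The provider-to-env-var map mirrors the `PROVIDER_ENV_VARS` table in the code change further down (shown there only in part); the selection logic itself is illustrative:

```ts
// One configured key => that provider is inferred; several keys (or none)
// => AI_PROVIDER must be set explicitly, as the docs above state.
const PROVIDER_ENV_VARS: Record<string, string | null> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  deepseek: "DEEPSEEK_API_KEY",
  siliconflow: "SILICONFLOW_API_KEY",
  ollama: null, // local Ollama needs no credentials
}

const configured = Object.entries(PROVIDER_ENV_VARS)
  .filter(([, envVar]) => envVar !== null && process.env[envVar])
  .map(([name]) => name)

const provider =
  process.env.AI_PROVIDER ??
  (configured.length === 1 ? configured[0] : undefined)

if (!provider) {
  throw new Error("Multiple or no API keys found; set AI_PROVIDER explicitly.")
}
```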

View File

@@ -1,6 +1,6 @@
 # AI Provider Configuration
 # AI_PROVIDER: Which provider to use
-# Options: bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek
+# Options: bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow
 # Default: bedrock
 AI_PROVIDER=bedrock
@@ -42,6 +42,11 @@ AI_MODEL=global.anthropic.claude-sonnet-4-5-20250929-v1:0
 # DEEPSEEK_API_KEY=sk-...
 # DEEPSEEK_BASE_URL=https://api.deepseek.com/v1 # Optional: Custom endpoint
+# SiliconFlow Configuration (OpenAI-compatible)
+# Base domain can be .com or .cn, defaults to https://api.siliconflow.com/v1
+# SILICONFLOW_API_KEY=sk-...
+# SILICONFLOW_BASE_URL=https://api.siliconflow.com/v1 # Optional: switch to https://api.siliconflow.cn/v1 if needed
 # Langfuse Observability (Optional)
 # Enable LLM tracing and analytics - https://langfuse.com
 # LANGFUSE_PUBLIC_KEY=pk-lf-...
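
To use the template above with SiliconFlow, a user uncomments `SILICONFLOW_API_KEY` (and optionally `SILICONFLOW_BASE_URL`). A tiny TypeScript sketch of how those two values resolve at runtime; the variable names match the template, the guard is illustrative:

```ts
// SILICONFLOW_API_KEY is required once AI_PROVIDER=siliconflow is chosen.
const apiKey = process.env.SILICONFLOW_API_KEY
if (!apiKey) {
  throw new Error("SILICONFLOW_API_KEY is required for AI_PROVIDER=siliconflow")
}

// The .com domain is the documented default; .cn is the alternative region.
const baseURL =
  process.env.SILICONFLOW_BASE_URL || "https://api.siliconflow.com/v1"
```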

View File

@@ -17,6 +17,7 @@ export type ProviderName =
| "ollama" | "ollama"
| "openrouter" | "openrouter"
| "deepseek" | "deepseek"
| "siliconflow"
interface ModelConfig { interface ModelConfig {
model: any model: any
@@ -47,6 +48,7 @@ const PROVIDER_ENV_VARS: Record<ProviderName, string | null> = {
   ollama: null, // No credentials needed for local Ollama
   openrouter: "OPENROUTER_API_KEY",
   deepseek: "DEEPSEEK_API_KEY",
+  siliconflow: "SILICONFLOW_API_KEY",
 }
 /**
@@ -90,7 +92,7 @@ function validateProviderCredentials(provider: ProviderName): void {
  * Get the AI model based on environment variables
  *
  * Environment variables:
- * - AI_PROVIDER: The provider to use (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek)
+ * - AI_PROVIDER: The provider to use (bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow)
  * - AI_MODEL: The model ID/name for the selected provider
  *
  * Provider-specific env vars:
@@ -104,6 +106,8 @@ function validateProviderCredentials(provider: ProviderName): void {
  * - OPENROUTER_API_KEY: OpenRouter API key
  * - DEEPSEEK_API_KEY: DeepSeek API key
  * - DEEPSEEK_BASE_URL: DeepSeek endpoint (optional)
+ * - SILICONFLOW_API_KEY: SiliconFlow API key
+ * - SILICONFLOW_BASE_URL: SiliconFlow endpoint (optional, defaults to https://api.siliconflow.com/v1)
  */
 export function getAIModel(): ModelConfig {
   const modelId = process.env.AI_MODEL
@@ -139,6 +143,7 @@ export function getAIModel(): ModelConfig {
         `- AWS_ACCESS_KEY_ID for Bedrock\n` +
         `- OPENROUTER_API_KEY for OpenRouter\n` +
         `- AZURE_API_KEY for Azure\n` +
+        `- SILICONFLOW_API_KEY for SiliconFlow\n` +
         `Or set AI_PROVIDER=ollama for local Ollama.`,
       )
     } else {
@@ -259,9 +264,20 @@ export function getAIModel(): ModelConfig {
       }
       break
case "siliconflow": {
const siliconflowProvider = createOpenAI({
apiKey: process.env.SILICONFLOW_API_KEY,
baseURL:
process.env.SILICONFLOW_BASE_URL ||
"https://api.siliconflow.com/v1",
})
model = siliconflowProvider.chat(modelId)
break
}
     default:
       throw new Error(
-        `Unknown AI provider: ${provider}. Supported providers: bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek`,
+        `Unknown AI provider: ${provider}. Supported providers: bedrock, openai, anthropic, google, azure, ollama, openrouter, deepseek, siliconflow`,
       )
   }
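
A hedged usage sketch of the new branch: `createOpenAI` from `@ai-sdk/openai` and `generateText` from `ai` are the real AI SDK entry points, and the provider is constructed exactly as the `case "siliconflow"` block above does; the model id is the example from the setup docs:

```ts
import { createOpenAI } from "@ai-sdk/openai"
import { generateText } from "ai"

// The stock OpenAI provider pointed at SiliconFlow's
// OpenAI-compatible endpoint, mirroring the new case above.
const siliconflow = createOpenAI({
  apiKey: process.env.SILICONFLOW_API_KEY,
  baseURL:
    process.env.SILICONFLOW_BASE_URL || "https://api.siliconflow.com/v1",
})

const { text } = await generateText({
  model: siliconflow.chat("deepseek-ai/DeepSeek-V3"), // example model id from the docs
  prompt: "Say hello from SiliconFlow.",
})
console.log(text)
```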