Commit Graph

15 Commits

Author SHA1 Message Date
terrydash
97bb350dd6 feat: add support for self-hosted draw.io via DRAWIO_BASE_URL environment variable (#163)
## Summary
Add support for self-hosted draw.io instances via build-time configuration.

## Problem
In some corporate environments, `embed.diagrams.net` is blocked by network policies. 
Users cannot use the application without access to the default draw.io embed URL.

## Solution
- Add `NEXT_PUBLIC_DRAWIO_BASE_URL` environment variable support
- Pass the `baseUrl` prop to the `DrawIoEmbed` component
- Configure Dockerfile to accept build-time argument for the draw.io URL

## Usage
```yaml
# docker-compose.yaml
services:
  drawio:
    image: jgraph/drawio:latest
    ports:
      - "8080:8080"
  
  next-ai-draw-io:
    build:
      context: .
      args:
        - NEXT_PUBLIC_DRAWIO_BASE_URL=http://drawio:8080
```

Or build directly:
```bash
docker build --build-arg NEXT_PUBLIC_DRAWIO_BASE_URL=http://localhost:8080 -t next-ai-draw-io .
```

**Note:** This is a build-time configuration. To change the draw.io URL, you need to rebuild the Docker image.
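
A minimal sketch of how the `baseUrl` prop from the Solution list might be wired up. The env var and prop name come from this PR; the `react-drawio` import and file layout are assumptions:

```tsx
// Assumed location: the client component that renders the draw.io iframe.
import { DrawIoEmbed } from "react-drawio";

export function DiagramEmbed() {
    // NEXT_PUBLIC_* values are inlined at build time, which is why changing
    // the URL requires rebuilding the image (see the note above).
    const baseUrl =
        process.env.NEXT_PUBLIC_DRAWIO_BASE_URL ?? "https://embed.diagrams.net";

    return <DrawIoEmbed baseUrl={baseUrl} />;
}
```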
2025-12-09 22:00:54 +09:00
QiyuanChen
d8cdd049d1 feat: add SiliconFlow as a supported AI provider (#137)
* feat: add SiliconFlow as a supported AI provider in documentation and configuration

* fix: update SiliconFlow configuration comment to English
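
SiliconFlow exposes an OpenAI-compatible API, so the wiring most likely goes through the generic OpenAI-compatible provider. A hedged sketch; the env var names, default base URL, and example model id are assumptions rather than values taken from this PR:

```ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

// Assumed variable names; check env.example for the ones this PR actually adds.
const siliconflow = createOpenAICompatible({
    name: "siliconflow",
    apiKey: process.env.SILICONFLOW_API_KEY,
    baseURL: process.env.SILICONFLOW_BASE_URL ?? "https://api.siliconflow.cn/v1",
});

// Example usage with an illustrative model id:
// const model = siliconflow("Qwen/Qwen2.5-72B-Instruct");
```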
2025-12-07 10:22:57 +09:00
Biki Kalita
8b578a456e fix: remove hardcoded temperature parameter so models that don't accept it keep working (#133)
* Fix: remove hardcoded temperature parameter to support reasoning models

* feat: make temperature configurable via AI_TEMPERATURE env var

- Instead of removing temperature entirely, make it optional via env var
- Set AI_TEMPERATURE=0 for deterministic output (recommended for diagrams)
- Leave unset for models that don't support temperature (e.g., GPT-5.1 reasoning)

* docs: add AI_TEMPERATURE env var documentation

- Update env.example with AI_TEMPERATURE option
- Update README.md configuration section
- Add Temperature Setting section in ai-providers.md

* docs: add TEMPERATURE env var documentation

- Update env.example with TEMPERATURE option
- Update README.md, README_CN.md, README_JA.md configuration sections
- Add Temperature Setting section in ai-providers.md
- Update route.ts to use TEMPERATURE env var
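
A sketch of the conditional handling described in these bullets: leaving `TEMPERATURE` unset omits the parameter entirely, so models that reject it keep working. The helper name and exact parsing are assumptions:

```ts
import { streamText } from "ai";

// Read TEMPERATURE once; unset or empty means "don't send the parameter".
const raw = process.env.TEMPERATURE;
const temperature = raw !== undefined && raw !== "" ? Number(raw) : undefined;

export function streamDiagram(
    model: Parameters<typeof streamText>[0]["model"],
    prompt: string
) {
    return streamText({
        model,
        prompt,
        // Spread so the key is omitted when TEMPERATURE is unset.
        ...(temperature !== undefined ? { temperature } : {}),
    });
}
```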

---------

Co-authored-by: dayuan.jiang <jiangdy@amazon.co.jp>
2025-12-07 01:34:59 +09:00
Dayuan Jiang
8d898d8adc fix: revert maxDuration to static value (Next.js requirement) (#121) 2025-12-06 18:04:23 +09:00
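
Background for the revert: Next.js route segment config such as `maxDuration` must be statically analyzable at build time, so it cannot be derived from an env var at runtime. A sketch with an illustrative value and assumed path:

```ts
// app/api/chat/route.ts (path assumed)
// Valid: Next.js reads this segment config statically at build time.
export const maxDuration = 60;

// Invalid: a computed value breaks static analysis, which is why #120 was reverted here.
// export const maxDuration = Number(process.env.MAX_DURATION ?? "60");
```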
Dayuan Jiang
1e0b1ed970 feat: make maxDuration configurable via MAX_DURATION env (#120) 2025-12-06 17:47:50 +09:00
Twelveeee
3fb349fb3e fix: clear button can't clear error msg & feat: add settings dialog and access code (#77)
* fix: clear button can't clear error msg

* new: add settings dialog and access code

* fix: address review feedback - dark mode, types, formatting

* feat: only show Settings button when access code is required

* refactor: rename ACCESS_CODES to ACCESS_CODE_LIST
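
A sketch of the server-side access code gate; the comma-separated parsing and header name are assumptions, and only `ACCESS_CODE_LIST` comes from this PR:

```ts
export function isAccessCodeValid(code: string | null): boolean {
    const allowed = (process.env.ACCESS_CODE_LIST ?? "")
        .split(",")
        .map((c) => c.trim())
        .filter(Boolean);

    // No codes configured: the app stays open and the Settings button is hidden.
    if (allowed.length === 0) return true;
    return code !== null && allowed.includes(code);
}

// Assumed usage in the chat route:
// if (!isAccessCodeValid(req.headers.get("x-access-code"))) return new Response("Unauthorized", { status: 401 });
```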

---------

Co-authored-by: dayuan.jiang <jdy.toh@gmail.com>
2025-12-05 22:09:34 +09:00
Dayuan Jiang
ed29e32ba3 feat: restore Langfuse observability integration (#103)
- Add lib/langfuse.ts with client, trace input/output, telemetry config
- Add instrumentation.ts for OpenTelemetry setup with Langfuse span processor
- Add /api/log-save endpoint for logging diagram saves
- Add /api/log-feedback endpoint for thumbs up/down feedback
- Update chat route with sessionId tracking and telemetry
- Add feedback buttons (thumbs up/down) to chat messages
- Add sessionId tracking throughout the app
- Update env.example with Langfuse configuration
- Add @langfuse/client, @langfuse/otel, @langfuse/tracing, @opentelemetry/sdk-trace-node
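
A sketch of the sessionId-aware telemetry on the chat route. `experimental_telemetry` is the AI SDK option being enabled; the metadata key and function id are assumptions, and the spans are picked up by the Langfuse span processor registered in instrumentation.ts:

```ts
import { streamText } from "ai";

export function streamChat(
    model: Parameters<typeof streamText>[0]["model"],
    prompt: string,
    sessionId: string
) {
    return streamText({
        model,
        prompt,
        experimental_telemetry: {
            // Only emit spans when Langfuse is configured.
            isEnabled: Boolean(process.env.LANGFUSE_PUBLIC_KEY),
            functionId: "chat", // assumed id
            metadata: { sessionId }, // assumed key used to group traces per session
        },
    });
}
```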
2025-12-05 21:15:02 +09:00
dayuan.jiang
ff6f130f8a refactor: remove Langfuse observability integration
- Delete lib/langfuse.ts, instrumentation.ts
- Remove API routes: log-save, log-feedback
- Remove feedback buttons (thumbs up/down) from chat
- Remove sessionId tracking throughout codebase
- Remove @langfuse/*, @opentelemetry dependencies
- Clean up env.example
2025-12-05 01:30:02 +09:00
Dayuan Jiang
fa1b02ad78 feat: integrate Langfuse for LLM observability (#66)
* feat: integrate Langfuse for LLM observability

- Add instrumentation.ts with Langfuse OpenTelemetry exporter
- Enable experimental telemetry on streamText calls
- Add instrumentationHook to Next.js config
- Install required dependencies (@vercel/otel, langfuse-vercel, etc.)

* feat: add optional Langfuse observability integration

- Add session tracking with unique sessionId per conversation
- Add user tracking via IP address (x-forwarded-for header)
- Make telemetry conditional - only enabled if LANGFUSE_PUBLIC_KEY is set
- Add environment variable validation in instrumentation.ts
- Add sessionId validation (type check + 200 char limit)
- Update env.example with Langfuse configuration docs
- Remove unused langfuse-vercel and @vercel/otel packages

* fix: remove deprecated instrumentationHook (enabled by default in Next.js 15)
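
A sketch of the conditional setup and sessionId validation mentioned above; the exporter wiring is elided, and only the env var gate, the type check, and the 200-character limit come from the bullets:

```ts
// instrumentation.ts
export async function register() {
    // Telemetry is opt-in: skip OpenTelemetry setup entirely when Langfuse is not configured.
    if (!process.env.LANGFUSE_PUBLIC_KEY) return;
    // ...register the Langfuse OpenTelemetry exporter here (wiring omitted in this sketch)...
}

// Reject non-string or oversized session ids before attaching them to traces.
export function sanitizeSessionId(value: unknown): string | undefined {
    return typeof value === "string" && value.length <= 200 ? value : undefined;
}
```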
2025-12-04 00:23:09 +09:00
dayuan.jiang
45ab934288 feat: add DeepSeek as AI provider
- Install @ai-sdk/deepseek package
- Add DeepSeek provider support to lib/ai-providers.ts
- Add DeepSeek configuration to env.example
- Update README.md with DeepSeek in provider list
- Support both default and custom base URL for DeepSeek
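
A sketch of the DeepSeek branch; `createDeepSeek` is the factory exported by `@ai-sdk/deepseek`, and the env var names mirror the other providers (assumed except where env.example says otherwise):

```ts
import { createDeepSeek } from "@ai-sdk/deepseek";

const deepseek = createDeepSeek({
    apiKey: process.env.DEEPSEEK_API_KEY,
    // Undefined baseURL falls back to DeepSeek's default endpoint; set
    // DEEPSEEK_BASE_URL to route through a compatible proxy instead.
    baseURL: process.env.DEEPSEEK_BASE_URL || undefined,
});

// Example usage: const model = deepseek("deepseek-chat");
```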
2025-12-02 11:52:09 +09:00
Dan Zheng
d4fb635d98 fix: add customizable Anthropic baseURL (#28)
* fix: add custom anthropic baseURL

* feat: add baseURL support for all AI providers

- Add GOOGLE_BASE_URL for Google Generative AI
- Add AZURE_BASE_URL for Azure OpenAI
- Add OLLAMA_BASE_URL support (was documented but not implemented)
- Add OPENROUTER_BASE_URL for OpenRouter
- Fix missing semicolon in Anthropic case
- Update env.example with new environment variables
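
A sketch of how the per-provider base URLs above might be threaded into the provider factories (two providers shown; the factory names come from the official @ai-sdk packages, while the switch shape and API key variable names are assumptions):

```ts
import { createAnthropic } from "@ai-sdk/anthropic";
import { createGoogleGenerativeAI } from "@ai-sdk/google";

export function getModel(provider: string, modelId: string) {
    switch (provider) {
        case "anthropic":
            return createAnthropic({
                apiKey: process.env.ANTHROPIC_API_KEY,
                baseURL: process.env.ANTHROPIC_BASE_URL || undefined,
            })(modelId);
        case "google":
            return createGoogleGenerativeAI({
                apiKey: process.env.GOOGLE_API_KEY,
                baseURL: process.env.GOOGLE_BASE_URL || undefined,
            })(modelId);
        default:
            throw new Error(`Unsupported provider: ${provider}`);
    }
}
```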

Closes #20

---------

Co-authored-by: dayuan.jiang <jdy.toh@gmail.com>
2025-12-02 01:08:06 +09:00
ylxmf
d2dd501f3f feat: support OpenAI-compatible LLMs 2025-11-21 17:03:47 +08:00
dayuan.jiang
58dcb3c41a feat: add OpenRouter support and fix input disabling
- Add OpenRouter provider support with @openrouter/ai-sdk-provider
- Fix input not disabling during 'submitted' state for fast providers
- Apply disable logic to all interactive elements (textarea, buttons, handlers)
- Clean up env.example by removing model examples and separator blocks
- Upgrade zod to v4.1.12 for compatibility with ollama-ai-provider-v2
- Add debug logging for status changes in chat components
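
A sketch of the disable logic described above; the status values are the ones reported by the AI SDK's useChat hook, and the component wiring is assumed:

```ts
type ChatStatus = "submitted" | "streaming" | "ready" | "error";

// Treat "submitted" as busy too, so the input locks as soon as the request is
// sent instead of only once tokens start streaming.
export function isInputDisabled(status: ChatStatus): boolean {
    return status === "submitted" || status === "streaming";
}

// Applied uniformly to the textarea, send button, and key handlers, e.g.:
// <Textarea disabled={isInputDisabled(status)} ... />
```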
2025-11-15 14:29:18 +09:00
dayuan.jiang
4a3abc2e39 add multiple providers 2025-11-15 13:36:42 +09:00
dayuan.jiang
34de984fb8 docs: update README and add env.example for API key configuration 2025-11-10 19:52:08 +09:00