diff --git a/README.md b/README.md
index f6d49a0..09008aa 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,6 @@ Demo site: [https://next-ai-draw-io.vercel.app/](https://next-ai-draw-io.vercel.
 - **Interactive Chat Interface**: Communicate with AI to refine your diagrams in real-time
 - **Smart Editing**: Modify existing diagrams using simple text prompts
 - **Targeted XML Editing**: AI can now make precise edits to specific parts of diagrams without regenerating the entire XML, making updates faster and more efficient
-- **Improved XML Handling**: Automatic formatting of single-line XML for better compatibility and reliability
 
 ## How It Works
 
@@ -32,79 +31,16 @@ This application supports multiple AI providers, making it easy to deploy with y
 
 ### Supported Providers
 
-| Provider | Status | Best For |
-|----------|--------|----------|
-| **AWS Bedrock** | ✅ Default | Claude models via AWS infrastructure |
-| **OpenAI** | ✅ Supported | GPT-4, GPT-5, and reasoning models |
-| **Anthropic** | ✅ Supported | Direct access to Claude models |
-| **Google AI** | ✅ Supported | Gemini models with multi-modal capabilities |
-| **Azure OpenAI** | ✅ Supported | Enterprise OpenAI deployments |
-| **Ollama** | ✅ Supported | Local/self-hosted open source models |
+| Provider         | Status       | Best For                                    |
+| ---------------- | ------------ | ------------------------------------------- |
+| **AWS Bedrock**  | ✅ Default   | Claude models via AWS infrastructure        |
+| **OpenAI**       | ✅ Supported | GPT-4, GPT-5, and reasoning models          |
+| **Anthropic**    | ✅ Supported | Direct access to Claude models              |
+| **Google AI**    | ✅ Supported | Gemini models with multi-modal capabilities |
+| **Azure OpenAI** | ✅ Supported | Enterprise OpenAI deployments               |
+| **Ollama**       | ✅ Supported | Local/self-hosted open source models        |
 
-### Quick Setup by Provider
-
-#### AWS Bedrock (Default)
-```bash
-AI_PROVIDER=bedrock
-AI_MODEL=global.anthropic.claude-sonnet-4-5-20250929-v1:0
-AWS_REGION=us-east-1
-AWS_ACCESS_KEY_ID=your-access-key
-AWS_SECRET_ACCESS_KEY=your-secret-key
-```
-
-#### OpenAI
-```bash
-AI_PROVIDER=openai
-AI_MODEL=gpt-4o
-OPENAI_API_KEY=sk-...
-```
-
-#### Anthropic
-```bash
-AI_PROVIDER=anthropic
-AI_MODEL=claude-sonnet-4-5
-ANTHROPIC_API_KEY=sk-ant-...
-```
-
-#### Google Generative AI
-```bash
-AI_PROVIDER=google
-AI_MODEL=gemini-2.5-flash
-GOOGLE_GENERATIVE_AI_API_KEY=...
-```
-
-#### Azure OpenAI
-```bash
-AI_PROVIDER=azure
-AI_MODEL=your-deployment-name
-AZURE_RESOURCE_NAME=your-resource
-AZURE_API_KEY=...
-```
-
-#### Ollama (Local)
-```bash
-AI_PROVIDER=ollama
-AI_MODEL=phi3
-OLLAMA_BASE_URL=http://localhost:11434/api # Optional
-```
-Note: Install models locally first with `ollama pull `
-
-### Recommended Models
-
-**Best Quality:**
-- AWS Bedrock: `global.anthropic.claude-sonnet-4-5-20250929-v1:0`
-- Anthropic: `claude-sonnet-4-5`
-- OpenAI: `gpt-4o` or `gpt-5`
-
-**Best Speed:**
-- Google: `gemini-2.5-flash`
-- OpenAI: `gpt-4o`
-- Anthropic: `claude-haiku-4-5`
-
-**Best Cost:**
-- Ollama: Free (local models)
-- Google: `gemini-1.5-flash-8b`
-- OpenAI: `gpt-4o-mini`
+Note that `claude-sonnet-4-5` has been trained on draw.io diagrams with AWS logos, so it is the best choice if you want to create AWS architecture diagrams.
 
 ## Getting Started
 
@@ -130,13 +66,14 @@ yarn install
 Create a `.env.local` file in the root directory:
 
 ```bash
-cp .env.example .env.local
+cp env.example .env.local
 ```
 
 Edit `.env.local` and configure your chosen provider:
-- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama)
-- Set `AI_MODEL` to the specific model you want to use
-- Add the required API keys for your provider
+
+- Set `AI_PROVIDER` to your chosen provider (bedrock, openai, anthropic, google, azure, ollama)
+- Set `AI_MODEL` to the specific model you want to use
+- Add the required API keys for your provider
 
 See the [Multi-Provider Support](#multi-provider-support) section above for provider-specific configuration examples.
 
@@ -157,6 +94,8 @@ Check out the [Next.js deployment documentation](https://nextjs.org/docs/app/bui
 Or you can deploy by this button.
 [![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FDayuanJiang%2Fnext-ai-draw-io)
 
+Be sure to **set the environment variables** in the Vercel dashboard as you did in your local `.env.local` file.
+
 ## Project Structure
 
 ```
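For a concrete picture of what the three configuration bullets produce, a minimal `.env.local` for the default Bedrock provider might look like the sketch below. The model ID and credential variable names are taken from the Bedrock example this diff removes from the README; other providers expect their own key variables, so adjust the names for your provider.

```bash
# .env.local: minimal sketch for the default Bedrock provider
AI_PROVIDER=bedrock
AI_MODEL=global.anthropic.claude-sonnet-4-5-20250929-v1:0

# Credentials (Bedrock uses standard AWS variables; other providers
# expect their own keys, e.g. OPENAI_API_KEY or ANTHROPIC_API_KEY)
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
```

When deploying with the Vercel button, the same variables go into the project's environment settings in the Vercel dashboard rather than a local file.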