the-island/backend/.env.example
3e89a17b69 fix: repair the Twitch integration - downgrade to twitchio 2.x
- Downgrade from twitchio 3.x to 2.10.0 (IRC-based)
  - 3.x uses the EventSub API, which requires more complex configuration
  - 2.x uses IRC, which is simpler and more reliable
- Simplify the Twitch configuration: only a token and a channel name are required
- Remove the client_id, client_secret, and bot_id requirements
- Update the configuration notes in .env.example
2026-01-01 18:59:14 +08:00


# Environment configuration for The Island Backend (LLM, Twitch, game settings)
# Copy this file to .env and fill in your values
# =============================================================================
# Option 1: OpenAI (default)
# =============================================================================
# OPENAI_API_KEY=sk-xxx
# LLM_MODEL=gpt-3.5-turbo
# =============================================================================
# Option 2: Anthropic Claude
# =============================================================================
# ANTHROPIC_API_KEY=sk-ant-xxx
# LLM_MODEL=claude-3-haiku-20240307
# =============================================================================
# Option 3: Google Gemini
# =============================================================================
# GEMINI_API_KEY=xxx
# LLM_MODEL=gemini/gemini-pro
# =============================================================================
# Option 4: Azure OpenAI
# =============================================================================
# AZURE_API_KEY=xxx
# AZURE_API_BASE=https://your-resource.openai.azure.com
# LLM_MODEL=azure/your-deployment-name
# =============================================================================
# Option 5: OpenRouter (access multiple models)
# =============================================================================
# OPENROUTER_API_KEY=sk-or-xxx
# LLM_MODEL=openrouter/anthropic/claude-3-haiku
# =============================================================================
# Option 6: Local Ollama
# =============================================================================
# OLLAMA_API_BASE=http://localhost:11434
# LLM_MODEL=ollama/llama2
# =============================================================================
# Option 7: Custom/Self-hosted endpoint
# See: https://docs.litellm.ai/docs/providers
# =============================================================================
# LLM_API_BASE=http://localhost:8000/v1
# LLM_API_KEY=your-key
# LLM_API_KEY_HEADER=api-key # Optional: custom header name for API key
#
# For OpenAI-compatible API:
# LLM_MODEL=openai/your-model-name
#
# For Anthropic-compatible API:
# LLM_MODEL=anthropic/your-model-name
# =============================================================================
# Example: Xiaomi MiMo (Anthropic-compatible)
# =============================================================================
# LLM_API_BASE=https://api.xiaomimomo.com/anthropic/v1/messages
# LLM_API_KEY=your-mimo-api-key
# LLM_API_KEY_HEADER=api-key
# LLM_MODEL=anthropic/mimo-v2-flash
# LITELLM_ANTHROPIC_DISABLE_URL_SUFFIX=true # Prevent appending /v1/messages
# =============================================================================
# Force mock mode (no API calls, uses predefined responses)
# =============================================================================
# LLM_MOCK_MODE=true
# =============================================================================
# Twitch Configuration for Live Stream Integration (twitchio 2.x)
# =============================================================================
# Get your OAuth Token from: https://twitchtokengenerator.com/
# 1. Select "Bot Chat Token"
# 2. Authorize with your Twitch account
# 3. Copy the Access Token (starts with oauth:)
#
# TWITCH_TOKEN=oauth:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# TWITCH_CHANNEL_NAME=your_channel_name
# TWITCH_COMMAND_PREFIX=!
# =============================================================================
# Game Configuration
# =============================================================================
# GAME_DIFFICULTY=casual
# DEBUG=true
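
A minimal sketch of how a backend might read the LLM variables above. The helper
`load_llm_settings` is hypothetical (not part of this repo); the variable names
`LLM_MODEL`, `LLM_API_BASE`, `LLM_API_KEY`, `LLM_API_KEY_HEADER`, and
`LLM_MOCK_MODE` are taken from this file, and the defaults shown are assumptions:

```python
import os

def load_llm_settings() -> dict:
    """Collect the LLM settings from the environment into one dict.

    Hypothetical helper: mirrors how a litellm-based backend could read
    the variables documented in .env.example. Defaults are assumptions.
    """
    return {
        "model": os.getenv("LLM_MODEL", "gpt-3.5-turbo"),
        "api_base": os.getenv("LLM_API_BASE"),  # None unless self-hosted
        "api_key": os.getenv("LLM_API_KEY"),
        # Custom header name for the API key; falls back to the standard one.
        "api_key_header": os.getenv("LLM_API_KEY_HEADER", "Authorization"),
        # Any value other than "true" (case-insensitive) disables mock mode.
        "mock_mode": os.getenv("LLM_MOCK_MODE", "false").lower() == "true",
    }
```

With `LLM_MODEL=ollama/llama2` and `LLM_MOCK_MODE=true` set in the environment, the returned dict would carry `model="ollama/llama2"` and `mock_mode=True` while the unset keys stay `None`.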