feat: add multi-provider LLM support with LiteLLM

- Replace openai with litellm for unified LLM interface
- Support 100+ providers: OpenAI, Anthropic, Gemini, Azure, Ollama, etc.
- Add custom API base URL support (LLM_API_BASE)
- Add .env file support with python-dotenv
- Add .env.example configuration template
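The .env.example template itself is not shown in this hunk; a minimal sketch of what it could contain (only LLM_API_BASE appears in the commit message — the other variable names are the providers' standard keys and are assumptions here):

```
# Optional custom endpoint, e.g. a local Ollama server (LLM_API_BASE is from this commit)
LLM_API_BASE=http://localhost:11434

# Provider API keys are read from the environment by LiteLLM;
# the names below are the usual provider variables, assumed for illustration
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```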

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-01 13:49:28 +08:00
parent 64ed46215f
commit 7e3872cdd8
5 changed files with 173 additions and 34 deletions


@@ -3,6 +3,10 @@ FastAPI entry point for the interactive live-stream game backend.
 Configures the application, WebSocket routes, and lifecycle events.
 """
+# Load .env file before any other imports
+from dotenv import load_dotenv
+load_dotenv()
 import logging
 from contextlib import asynccontextmanager
 from pathlib import Path
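The diff places load_dotenv() before the other imports because modules may read environment variables at import time; variables loaded afterwards would be missed. A stdlib-only sketch of what load_dotenv() does with a file's contents (simplified — real python-dotenv also handles quoting, `export` prefixes, and interpolation):

```python
import os

def load_env_text(text: str) -> None:
    # Parse KEY=VALUE lines into os.environ, skipping blanks and comments.
    # setdefault mirrors load_dotenv's default of not overriding existing vars.
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

load_env_text("LLM_API_BASE=http://localhost:11434\n# a comment\n")
print(os.environ["LLM_API_BASE"])
```

Because os.environ is already populated when the later imports run, any module that reads LLM_API_BASE at import time sees the value from the .env file.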