From e7e544174f969949711340784932ab5161752fea Mon Sep 17 00:00:00 2001
From: Peter Steinberger
Date: Mon, 12 Jan 2026 22:45:00 +0000
Subject: [PATCH] docs: add region guidance for hosted minimax

---
 docs/gateway/local-models.md | 5 +++++
 docs/start/faq.md            | 4 ++++
 2 files changed, 9 insertions(+)

diff --git a/docs/gateway/local-models.md b/docs/gateway/local-models.md
index e9c83136c..f0fd06383 100644
--- a/docs/gateway/local-models.md
+++ b/docs/gateway/local-models.md
@@ -113,6 +113,11 @@ Keep hosted models configured even when running local; use `models.mode: "merge"
 
 Swap the primary and fallback order; keep the same providers block and `models.mode: "merge"` so you can fall back to Sonnet or GPT-4.1 when the local box is down.
 
+### Regional hosting / data routing
+
+- Hosted MiniMax/Kimi/GLM variants also exist on OpenRouter with region-pinned endpoints (e.g., US-hosted). Pick the regional variant there to keep traffic in your chosen jurisdiction while still using `models.mode: "merge"` for Anthropic/OpenAI fallbacks.
+- Local-only remains the strongest privacy path; hosted regional routing is the middle ground when you need provider features but want control over data flow.
+
 ## Other OpenAI-compatible local proxies
 
 vLLM, LiteLLM, OAI-proxy, or custom gateways work if they expose an OpenAI-style `/v1` endpoint. Replace the provider block above with your endpoint and model ID:
diff --git a/docs/start/faq.md b/docs/start/faq.md
index 8845b284b..66ba92764 100644
--- a/docs/start/faq.md
+++ b/docs/start/faq.md
@@ -124,6 +124,10 @@ Clawdbot supports **OpenAI Code (Codex)** via OAuth or by reusing your Codex CLI
 
 Usually no. Clawdbot needs large context + strong safety; small cards truncate and leak. If you must, run the **largest** MiniMax M2.1 build you can locally (LM Studio) and see [/gateway/local-models](/gateway/local-models). Smaller/quantized models increase prompt-injection risk — see [Security](/gateway/security).
 
+### How do I keep hosted model traffic in a specific region?
+
+Pick region-pinned endpoints. OpenRouter exposes US-hosted options for MiniMax, Kimi, and GLM; choose the US-hosted variant to keep data in-region. You can still list Anthropic/OpenAI alongside these by using `models.mode: "merge"` so fallbacks stay available while respecting the region-pinned provider you select.
+
 ### Can I use Bun?
 
 Bun is supported for faster TypeScript execution, but **WhatsApp requires Node** in this ecosystem. The wizard lets you pick the runtime; choose **Node** if you use WhatsApp.
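The `models.mode: "merge"` setup both hunks refer to might look roughly like the sketch below. This is a hypothetical illustration only: the field names, the `openrouter` provider block, and the model IDs are assumptions, not the documented Clawdbot schema.

```json5
{
  models: {
    // "merge" keeps hosted providers available alongside the primary,
    // so fallbacks still work when the preferred route is down.
    mode: "merge",
    primary: "openrouter/minimax-m2.1",          // hypothetical region-pinned variant ID
    fallbacks: ["anthropic/claude-sonnet", "openai/gpt-4.1"],
  },
  providers: {
    openrouter: {
      baseUrl: "https://openrouter.ai/api/v1",   // OpenAI-style /v1 endpoint
      apiKey: "${OPENROUTER_API_KEY}",           // illustrative env-var substitution
    },
  },
}
```

The intent is that region pinning happens at the OpenRouter model/endpoint selection, while `mode: "merge"` preserves the Anthropic/OpenAI fallback chain described in the docs.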