fix: align opencode-zen provider setup

docs/concepts/model-providers.md (new file, 162 lines)

@@ -0,0 +1,162 @@
---
summary: "Model provider overview with example configs + CLI flows"
read_when:
- You need a provider-by-provider model setup reference
- You want example configs or CLI onboarding commands for model providers
---
# Model providers

This page covers **LLM/model providers** (not chat providers like WhatsApp/Telegram).
For model selection rules, see [/concepts/models](/concepts/models).

## Quick rules

- Model refs use `provider/model` (example: `opencode/claude-opus-4-5`).
- If you set `agents.defaults.models`, that list becomes the allowlist of selectable models (see the sketch below).
- CLI helpers: `clawdbot onboard`, `clawdbot models list`, `clawdbot models set <provider/model>`.
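
A minimal allowlist sketch (the model names here are illustrative, not a recommendation):

```json5
{
  agents: {
    defaults: {
      model: { primary: "opencode/claude-opus-4-5" },
      // Once `models` is set, only the refs listed here are selectable.
      models: {
        "opencode/claude-opus-4-5": { alias: "Opus" },
        "openai/gpt-5.2": { alias: "GPT" }
      }
    }
  }
}
```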

## Built-in providers (pi-ai catalog)

Clawdbot ships with the pi‑ai catalog. These providers require **no**
`models.providers` config; just set auth + pick a model.

### OpenAI

- Provider: `openai`
- Auth: `OPENAI_API_KEY`
- Example model: `openai/gpt-5.2`
- CLI: `clawdbot onboard --auth-choice openai-api-key`

```json5
{
  agents: { defaults: { model: { primary: "openai/gpt-5.2" } } }
}
```

### Anthropic

- Provider: `anthropic`
- Auth: `ANTHROPIC_API_KEY` or `claude setup-token`
- Example model: `anthropic/claude-opus-4-5`
- CLI: `clawdbot onboard --auth-choice setup-token`

```json5
{
  agents: { defaults: { model: { primary: "anthropic/claude-opus-4-5" } } }
}
```

### OpenAI Code (Codex)

- Provider: `openai-codex`
- Auth: OAuth or Codex CLI (`~/.codex/auth.json`)
- Example model: `openai-codex/gpt-5.2`
- CLI: `clawdbot onboard --auth-choice openai-codex` or `codex-cli`

```json5
{
  agents: { defaults: { model: { primary: "openai-codex/gpt-5.2" } } }
}
```

### OpenCode Zen

- Provider: `opencode`
- Auth: `OPENCODE_API_KEY` (or `OPENCODE_ZEN_API_KEY`)
- Example model: `opencode/claude-opus-4-5`
- CLI: `clawdbot onboard --auth-choice opencode-zen`

```json5
{
  agents: { defaults: { model: { primary: "opencode/claude-opus-4-5" } } }
}
```

### Google Gemini (API key)

- Provider: `google`
- Auth: `GEMINI_API_KEY`
- Example model: `google/gemini-3-pro`
- CLI: `clawdbot onboard --auth-choice gemini-api-key`
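
A config sketch in the same shape as the providers above, using the example model listed here:

```json5
{
  agents: { defaults: { model: { primary: "google/gemini-3-pro" } } }
}
```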

### Google Vertex / Antigravity / Gemini CLI

- Providers: `google-vertex`, `google-antigravity`, `google-gemini-cli`
- Auth: Vertex uses gcloud Application Default Credentials (ADC); Antigravity and Gemini CLI use their respective auth flows
- CLI: `clawdbot onboard --auth-choice antigravity` (others via the interactive wizard)
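
A hedged config sketch, assuming Vertex model refs follow the same `provider/model` pattern; the exact model id may differ in your catalog, so check `clawdbot models list`:

```json5
{
  agents: { defaults: { model: { primary: "google-vertex/gemini-3-pro" } } }
}
```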

### Z.AI (GLM)

- Provider: `zai`
- Auth: `ZAI_API_KEY`
- Example model: `zai/glm-4.7`
- CLI: `clawdbot onboard --auth-choice zai-api-key`
- Aliases: `z.ai/*` and `z-ai/*` normalize to `zai/*`
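
Same config shape, using the example model above:

```json5
{
  agents: { defaults: { model: { primary: "zai/glm-4.7" } } }
}
```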

### Other built-in providers

- OpenRouter: `openrouter` (`OPENROUTER_API_KEY`)
  - Example model: `openrouter/anthropic/claude-sonnet-4-5`
- xAI: `xai` (`XAI_API_KEY`)
- Groq: `groq` (`GROQ_API_KEY`)
- Cerebras: `cerebras` (`CEREBRAS_API_KEY`)
- Mistral: `mistral` (`MISTRAL_API_KEY`)
- GitHub Copilot: `github-copilot` (`COPILOT_GITHUB_TOKEN` / `GH_TOKEN` / `GITHUB_TOKEN`)
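
These follow the same config pattern; for example, with the OpenRouter model listed above:

```json5
{
  agents: { defaults: { model: { primary: "openrouter/anthropic/claude-sonnet-4-5" } } }
}
```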

## Providers via `models.providers` (custom/base URL)

Use `models.providers` (or `models.json`) to add **custom** providers or
OpenAI/Anthropic‑compatible proxies.

### MiniMax

MiniMax is configured via `models.providers` because it uses custom endpoints:

- MiniMax Cloud (OpenAI‑compatible): `--auth-choice minimax-cloud`
- MiniMax API (Anthropic‑compatible): `--auth-choice minimax-api`
- Auth: `MINIMAX_API_KEY`
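
The onboarding choices above write this config for you. A hand-written equivalent would look roughly like the sketch below; the provider key, base URL, and model id are placeholders, so take the real values from your MiniMax account (the shape matches the local-proxy example later on this page):

```json5
{
  agents: { defaults: { model: { primary: "minimax/minimax-m2.1" } } },
  models: {
    providers: {
      minimax: {
        // Placeholder: use the OpenAI-compatible /v1 endpoint from your MiniMax dashboard.
        baseUrl: "https://<minimax-openai-compatible-host>/v1",
        apiKey: "${MINIMAX_API_KEY}",
        api: "openai-completions",
        models: [
          {
            id: "minimax-m2.1", // placeholder model id
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}
```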

### Local proxies (LM Studio, vLLM, LiteLLM, etc.)

Example (OpenAI‑compatible):

```json5
{
  agents: {
    defaults: {
      model: { primary: "lmstudio/minimax-m2.1-gs32" },
      models: { "lmstudio/minimax-m2.1-gs32": { alias: "Minimax" } }
    }
  },
  models: {
    providers: {
      lmstudio: {
        baseUrl: "http://localhost:1234/v1",
        apiKey: "LMSTUDIO_KEY",
        api: "openai-completions",
        models: [
          {
            id: "minimax-m2.1-gs32",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}
```

## CLI examples

```bash
clawdbot onboard --auth-choice opencode-zen
clawdbot models set opencode/claude-opus-4-5
clawdbot models list
```

See also: [/gateway/configuration](/gateway/configuration) for full configuration examples.

@@ -9,6 +9,7 @@ read_when:

See [/concepts/model-failover](/concepts/model-failover) for auth profile
rotation, cooldowns, and how that interacts with fallbacks.
Quick provider overview + examples: [/concepts/model-providers](/concepts/model-providers).

## How model selection works


@@ -580,6 +580,7 @@
"group": "Install & Updates",
"pages": [
"install/updating",
"install/ansible",
"install/nix",
"install/docker",
"install/bun"
@@ -589,7 +590,9 @@
"group": "CLI",
"pages": [
"cli/index",
"cli/message",
"cli/gateway",
"cli/update",
"cli/sandbox"
]
},
@@ -612,12 +615,16 @@
"concepts/presence",
"concepts/provider-routing",
"concepts/messages",
"concepts/streaming",
"concepts/groups",
"concepts/group-messages",
"concepts/typing-indicators",
"concepts/queue",
"concepts/retry",
"concepts/model-providers",
"concepts/models",
"concepts/model-failover",
"concepts/usage-tracking",
"concepts/timezone",
"concepts/typebox"
]
@@ -628,6 +635,7 @@
"gateway",
"gateway/pairing",
"gateway/gateway-lock",
"environment",
"gateway/configuration",
"gateway/configuration-examples",
"gateway/authentication",
@@ -637,7 +645,10 @@
"gateway/doctor",
"gateway/logging",
"gateway/security",
"gateway/sandbox-vs-tool-policy-vs-elevated",
"gateway/sandboxing",
"gateway/troubleshooting",
"debugging",
"gateway/remote",
"gateway/remote-gateway-readme",
"gateway/discovery",
@@ -659,12 +670,15 @@
"group": "Providers",
"pages": [
"providers/whatsapp",
"broadcast-groups",
"providers/telegram",
"providers/grammy",
"providers/discord",
"providers/slack",
"providers/signal",
"providers/imessage",
"providers/msteams",
"providers/troubleshooting",
"providers/location"
]
},
@@ -690,6 +704,8 @@
"tools/thinking",
"tools/agent-send",
"tools/subagents",
"multi-agent-sandbox-tools",
"tools/reactions",
"tools/skills",
"tools/skills-config",
"tools/clawdhub"

@@ -1423,6 +1423,7 @@ Clawdbot uses the **pi-coding-agent** model catalog. You can add custom provider
(LiteLLM, local OpenAI-compatible servers, Anthropic proxies, etc.) by writing
`~/.clawdbot/agents/<agentId>/agent/models.json` or by defining the same schema inside your
Clawdbot config under `models.providers`.
Provider-by-provider overview + examples: [/concepts/model-providers](/concepts/model-providers).

When `models.providers` is present, Clawdbot writes/merges a `models.json` into
`~/.clawdbot/agents/<agentId>/agent/` on startup:
@@ -1467,10 +1468,12 @@ Select the model via `agents.defaults.model.primary` (provider/model).

### OpenCode Zen (multi-model proxy)

OpenCode Zen is an OpenAI-compatible proxy at `https://opencode.ai/zen/v1`. Get an API key at https://opencode.ai/auth and set `OPENCODE_ZEN_API_KEY`.
OpenCode Zen is a multi-model gateway with per-model endpoints. Clawdbot uses
the built-in `opencode` provider from pi-ai; set `OPENCODE_API_KEY` (or
`OPENCODE_ZEN_API_KEY`) from https://opencode.ai/auth.

Notes:
- Model refs use `opencode-zen/<modelId>` (example: `opencode-zen/claude-opus-4-5`).
- Model refs use `opencode/<modelId>` (example: `opencode/claude-opus-4-5`).
- If you enable an allowlist via `agents.defaults.models`, add each model you plan to use.
- Shortcut: `clawdbot onboard --auth-choice opencode-zen`.

@@ -1478,29 +1481,8 @@ Notes:
{
agents: {
defaults: {
model: { primary: "opencode-zen/claude-opus-4-5" },
models: { "opencode-zen/claude-opus-4-5": { alias: "Opus" } }
}
},
models: {
mode: "merge",
providers: {
"opencode-zen": {
baseUrl: "https://opencode.ai/zen/v1",
apiKey: "${OPENCODE_ZEN_API_KEY}",
api: "openai-completions",
models: [
{
id: "claude-opus-4-5",
name: "Claude Opus 4.5",
reasoning: true,
input: ["text", "image"],
cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
contextWindow: 200000,
maxTokens: 32000
}
]
}
model: { primary: "opencode/claude-opus-4-5" },
models: { "opencode/claude-opus-4-5": { alias: "Opus" } }
}
}
}

@@ -78,7 +78,7 @@ Tip: `--json` does **not** imply non-interactive mode. Use `--non-interactive` (
- **OpenAI Code (Codex) subscription (OAuth)**: browser flow; paste the `code#state`.
  - Sets `agents.defaults.model` to `openai-codex/gpt-5.2` when model is unset or `openai/*`.
- **OpenAI API key**: uses `OPENAI_API_KEY` if present or prompts for a key, then saves it to `~/.clawdbot/.env` so launchd can read it.
- **OpenCode Zen (multi-model proxy)**: prompts for `OPENCODE_ZEN_API_KEY` (get it at https://opencode.ai/auth).
- **OpenCode Zen (multi-model proxy)**: prompts for `OPENCODE_API_KEY` (or `OPENCODE_ZEN_API_KEY`, get it at https://opencode.ai/auth).
- **API key**: stores the key for you.
- **MiniMax M2.1 (minimax.io)**: config is auto‑written for the OpenAI-compatible `/v1` endpoint.
- **MiniMax API (platform.minimax.io)**: config is auto‑written for the Anthropic-compatible `/anthropic` endpoint.
@@ -205,7 +205,7 @@ OpenCode Zen example:
clawdbot onboard --non-interactive \
  --mode local \
  --auth-choice opencode-zen \
  --opencode-zen-api-key "$OPENCODE_ZEN_API_KEY" \
  --opencode-zen-api-key "$OPENCODE_API_KEY" \
  --gateway-port 18789 \
  --gateway-bind loopback
```