| summary | read_when |
| --- | --- |
| Use MiniMax M2.1 in Clawdbot (cloud, API, or LM Studio) | |
# MiniMax
MiniMax is an AI company that builds the M2/M2.1 model family. The current coding-focused release is MiniMax M2.1 (December 23, 2025), built for real-world complex tasks.
Source: MiniMax M2.1 release note
## Model overview (M2.1)
MiniMax highlights these improvements in M2.1:
- Stronger multi-language coding (Rust, Java, Go, C++, Kotlin, Objective-C, TS/JS).
- Better web/app development and aesthetic output quality (including native mobile).
- Improved composite instruction handling for office-style workflows, building on interleaved thinking and integrated constraint execution.
- More concise responses with lower token usage and faster iteration loops.
- Stronger tool/agent framework compatibility and context management (Claude Code, Droid/Factory AI, Cline, Kilo Code, Roo Code, BlackBox).
- Higher-quality dialogue and technical writing outputs.
## Choose a setup
### Option A: MiniMax Cloud (OpenAI-compatible `/v1`)

Best for: hosted MiniMax with an OpenAI-compatible API.

Configure via CLI:

- Run `clawdbot configure`
- Select Model/auth
- Choose MiniMax M2.1 (minimax.io)

Or edit your config directly:
```json5
{
  env: { MINIMAX_API_KEY: "sk-..." },
  agents: { defaults: { model: { primary: "minimax/MiniMax-M2.1" } } },
  models: {
    mode: "merge",
    providers: {
      minimax: {
        baseUrl: "https://api.minimax.io/v1",
        apiKey: "${MINIMAX_API_KEY}",
        api: "openai-completions",
        models: [
          {
            id: "MiniMax-M2.1",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}
```
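To confirm the key and model id work before Clawdbot uses them, you can call the endpoint directly. The snippet below is a minimal sketch (not part of Clawdbot) that assumes MiniMax's OpenAI-compatible API serves the usual `/v1/chat/completions` route; check MiniMax's API docs for the exact request shape.

```typescript
// Quick sanity check of MiniMax's OpenAI-compatible endpoint, outside Clawdbot.
// Run with Node 18+ (global fetch is available).
async function main() {
  const apiKey = process.env.MINIMAX_API_KEY;
  if (!apiKey) throw new Error("MINIMAX_API_KEY is not set");

  // Standard OpenAI-style chat completion against the /v1 base URL from the config above.
  const res = await fetch("https://api.minimax.io/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "MiniMax-M2.1",
      messages: [{ role: "user", content: "Reply with OK." }],
      max_tokens: 32,
    }),
  });

  if (!res.ok) throw new Error(`MiniMax request failed: ${res.status} ${await res.text()}`);
  const data = await res.json();
  console.log(data.choices?.[0]?.message?.content);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

If this prints a response, the same key and model id will work in the provider block above.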
### Option B: MiniMax API (Anthropic-compatible `/anthropic`)

Best for: MiniMax's Anthropic-compatible API (platform.minimax.io).

Configure via CLI:

- Run `clawdbot configure`
- Select Model/auth
- Choose MiniMax API (platform.minimax.io)

Or edit your config directly:
```json5
{
  env: { MINIMAX_API_KEY: "sk-..." },
  agents: { defaults: { model: { primary: "minimax/MiniMax-M2.1" } } },
  models: {
    mode: "merge",
    providers: {
      minimax: {
        baseUrl: "https://api.minimax.io/anthropic",
        apiKey: "${MINIMAX_API_KEY}",
        api: "anthropic-messages",
        models: [
          {
            id: "MiniMax-M2.1",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 15, output: 60, cacheRead: 2, cacheWrite: 10 },
            contextWindow: 200000,
            maxTokens: 8192
          },
          {
            id: "MiniMax-M2.1-lightning",
            name: "MiniMax M2.1 Lightning",
            reasoning: false,
            input: ["text"],
            cost: { input: 15, output: 60, cacheRead: 2, cacheWrite: 10 },
            contextWindow: 200000,
            maxTokens: 8192
          },
          {
            id: "MiniMax-M2",
            name: "MiniMax M2",
            reasoning: true,
            input: ["text"],
            cost: { input: 15, output: 60, cacheRead: 2, cacheWrite: 10 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}
```
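As with Option A, you can sanity-check this endpoint before switching Clawdbot over. This is a hedged sketch only: it assumes the gateway mounts the standard Anthropic Messages route (`/v1/messages`) and headers under the `/anthropic` base path shown above, which you should verify against the platform.minimax.io docs.

```typescript
// Sanity check of MiniMax's Anthropic-compatible endpoint, outside Clawdbot.
// Assumption: the standard Anthropic Messages route and headers are served
// under the /anthropic base path; confirm with MiniMax's platform docs.
async function main() {
  const apiKey = process.env.MINIMAX_API_KEY;
  if (!apiKey) throw new Error("MINIMAX_API_KEY is not set");

  const res = await fetch("https://api.minimax.io/anthropic/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify({
      model: "MiniMax-M2.1",
      max_tokens: 64,
      messages: [{ role: "user", content: "Reply with OK." }],
    }),
  });

  if (!res.ok) throw new Error(`MiniMax request failed: ${res.status} ${await res.text()}`);
  const data = await res.json();
  // Anthropic-style responses put text in content blocks.
  console.log(data.content?.[0]?.text);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```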
### Option C: Local via LM Studio

Best for: local inference with LM Studio. We have seen strong results running MiniMax M2.1 on powerful hardware (e.g. a beefy Mac Studio) with LM Studio's local server.

Configure via CLI:

- Run `clawdbot configure`
- Select Model/auth
- Choose MiniMax M2.1 (LM Studio)

Or edit your config directly:
```json5
{
  agents: {
    defaults: {
      model: { primary: "lmstudio/minimax-m2.1-gs32" },
      models: { "lmstudio/minimax-m2.1-gs32": { alias: "Minimax" } }
    }
  },
  models: {
    mode: "merge",
    providers: {
      lmstudio: {
        baseUrl: "http://127.0.0.1:1234/v1",
        apiKey: "lmstudio",
        api: "openai-responses",
        models: [
          {
            id: "minimax-m2.1-gs32",
            name: "MiniMax M2.1 GS32",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 196608,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}
```
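Before pointing Clawdbot at LM Studio, make sure the local server is running and note the exact model id it reports; `minimax-m2.1-gs32` above is just the id of the build we loaded, and yours may differ. A small check against the OpenAI-compatible `/v1/models` route that LM Studio's server exposes:

```typescript
// List the model ids LM Studio's local server exposes, so the id in the config
// above ("minimax-m2.1-gs32") matches a model that is actually loaded.
async function main() {
  const res = await fetch("http://127.0.0.1:1234/v1/models");
  if (!res.ok) throw new Error(`LM Studio server not reachable: ${res.status}`);

  // LM Studio mirrors the OpenAI list-models shape: { data: [{ id, ... }, ...] }
  const { data } = await res.json();
  for (const model of data) {
    console.log(model.id);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Whatever id the server prints is what belongs in `models.providers.lmstudio.models` and in the `lmstudio/<model>` ref used as the default.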
## Configure via `clawdbot configure`

Use the interactive config wizard to set up MiniMax without editing JSON:

- Run `clawdbot configure`.
- Select Model/auth.
- Choose MiniMax M2.1 (minimax.io), MiniMax API (platform.minimax.io), or MiniMax M2.1 (LM Studio).
- Pick your default model when prompted.
## Configuration options

- `models.providers.minimax.baseUrl`: `https://api.minimax.io/v1` or `https://api.minimax.io/anthropic`.
- `models.providers.minimax.api`: `openai-completions` (cloud) or `anthropic-messages` (API).
- `models.providers.minimax.apiKey`: MiniMax API key (`MINIMAX_API_KEY`).
- `models.providers.minimax.models`: define `id`, `name`, `reasoning`, `contextWindow`, `maxTokens`, and `cost`.
- `agents.defaults.models`: alias models you want in the allowlist.
- `models.mode`: keep `merge` to add MiniMax alongside the built-ins.
## Notes

- Model refs are `minimax/<model>` or `lmstudio/<model>`.
- MiniMax pricing is not published; the costs above are placeholders. Override them in `models.json` for accurate cost tracking.
- See /concepts/model-providers for provider rules.
- Use `clawdbot models list` and `clawdbot models set minimax/MiniMax-M2.1` to switch models.