refactor: rename clawdbot to moltbot with legacy compat

Peter Steinberger
2026-01-27 12:19:58 +00:00
parent 83460df96f
commit 6d16a658e5
1839 changed files with 11250 additions and 11199 deletions


@@ -1,12 +1,12 @@
---
-summary: "Run Clawdbot with Ollama (local LLM runtime)"
+summary: "Run Moltbot with Ollama (local LLM runtime)"
read_when:
-  - You want to run Clawdbot with local models via Ollama
+  - You want to run Moltbot with local models via Ollama
- You need Ollama setup and configuration guidance
---
# Ollama
-Ollama is a local LLM runtime that makes it easy to run open-source models on your machine. Clawdbot integrates with Ollama's OpenAI-compatible API and can **auto-discover tool-capable models** when you opt in with `OLLAMA_API_KEY` (or an auth profile) and do not define an explicit `models.providers.ollama` entry.
+Ollama is a local LLM runtime that makes it easy to run open-source models on your machine. Moltbot integrates with Ollama's OpenAI-compatible API and can **auto-discover tool-capable models** when you opt in with `OLLAMA_API_KEY` (or an auth profile) and do not define an explicit `models.providers.ollama` entry.
## Quick start
@@ -22,14 +22,14 @@ ollama pull qwen2.5-coder:32b
ollama pull deepseek-r1:32b
```
-3) Enable Ollama for Clawdbot (any value works; Ollama doesn't require a real key):
+3) Enable Ollama for Moltbot (any value works; Ollama doesn't require a real key):
```bash
# Set environment variable
export OLLAMA_API_KEY="ollama-local"
# Or configure in your config file
-clawdbot config set models.providers.ollama.apiKey "ollama-local"
+moltbot config set models.providers.ollama.apiKey "ollama-local"
```
4) Use Ollama models:
@@ -46,7 +46,7 @@ clawdbot config set models.providers.ollama.apiKey "ollama-local"
## Model discovery (implicit provider)
-When you set `OLLAMA_API_KEY` (or an auth profile) and **do not** define `models.providers.ollama`, Clawdbot discovers models from the local Ollama instance at `http://127.0.0.1:11434`:
+When you set `OLLAMA_API_KEY` (or an auth profile) and **do not** define `models.providers.ollama`, Moltbot discovers models from the local Ollama instance at `http://127.0.0.1:11434`:
- Queries `/api/tags` and `/api/show`
- Keeps only models that report `tools` capability
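The capability filter above can be sketched in shell. This is a minimal illustration, not the actual discovery code; the response shape is an assumption based on the `capabilities` array Ollama returns from `/api/show`:

```shell
# Hypothetical /api/show response fragment; tool-capable models
# report "tools" in their capabilities array.
show_response='{"capabilities":["completion","tools"]}'

# Keep the model only if its capabilities include "tools".
if printf '%s' "$show_response" | grep -q '"tools"'; then
  echo "tool-capable"
fi
```

In practice the check runs against the live response from `http://127.0.0.1:11434/api/show` for each model listed by `/api/tags`.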
@@ -61,7 +61,7 @@ To see what models are available:
```bash
ollama list
-clawdbot models list
+moltbot models list
```
To add a new model, simply pull it with Ollama:
@@ -117,7 +117,7 @@ Use explicit config when:
}
```
-If `OLLAMA_API_KEY` is set, you can omit `apiKey` in the provider entry and Clawdbot will fill it for availability checks.
+If `OLLAMA_API_KEY` is set, you can omit `apiKey` in the provider entry and Moltbot will fill it for availability checks.
### Custom base URL (explicit config)
@@ -157,7 +157,7 @@ Once configured, all your Ollama models are available:
### Reasoning models
-Clawdbot marks models as reasoning-capable when Ollama reports `thinking` in `/api/show`:
+Moltbot marks models as reasoning-capable when Ollama reports `thinking` in `/api/show`:
```bash
ollama pull deepseek-r1:32b
@@ -169,7 +169,7 @@ Ollama is free and runs locally, so all model costs are set to $0.
### Context windows
-For auto-discovered models, Clawdbot uses the context window reported by Ollama when available, otherwise it defaults to `8192`. You can override `contextWindow` and `maxTokens` in explicit provider config.
+For auto-discovered models, Moltbot uses the context window reported by Ollama when available, otherwise it defaults to `8192`. You can override `contextWindow` and `maxTokens` in explicit provider config.
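As a sketch, an explicit provider entry overriding both values might look like the following. The key placement is an assumption inferred from the `models.providers.ollama` paths used above, and the numbers are illustrative:

```json
{
  "models": {
    "providers": {
      "ollama": {
        "apiKey": "ollama-local",
        "contextWindow": 32768,
        "maxTokens": 8192
      }
    }
  }
}
```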
## Troubleshooting
@@ -189,7 +189,7 @@ curl http://localhost:11434/api/tags
### No models available
-Clawdbot only auto-discovers models that report tool support. If your model isn't listed, either:
+Moltbot only auto-discovers models that report tool support. If your model isn't listed, either:
- Pull a tool-capable model, or
- Define the model explicitly in `models.providers.ollama`.