docs: make remote host examples generic
@@ -92,7 +92,7 @@ The macOS app “Remote over SSH” mode uses a local port-forward so the remote
CLI equivalent:

```bash
-clawdbot gateway status --ssh steipete@peters-mac-studio-1
+clawdbot gateway status --ssh user@gateway-host
```

Options:
@@ -192,7 +192,7 @@ Offline-friendly alternatives (in increasing complexity):
- SuCo (research-grade; attractive if there’s a solid implementation you can embed)

Open question:
-- what’s the **best** offline embedding model for “personal assistant memory” on your machines (MacBook + Castle)?
+- what’s the **best** offline embedding model for “personal assistant memory” on your machines (laptop + desktop)?
- if you already have Ollama: embed with a local model; otherwise ship a small embedding model in the toolchain.
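If Ollama is already on the box, the embedding step mentioned in the last bullet can be exercised directly against its local API. A minimal sketch, assuming Ollama's default port and using `nomic-embed-text` purely as an illustrative model choice (not a recommendation from this doc):

```bash
# Pull an embedding model and request a vector from Ollama's local API.
# 11434 is Ollama's default port; the model name here is only an example.
ollama pull nomic-embed-text
curl -s http://127.0.0.1:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "note to remember for later"}'
```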
## Smallest useful pilot
@@ -1726,7 +1726,7 @@ Notes:
### Local models (LM Studio) — recommended setup

-Best current local setup (what we’re running): **MiniMax M2.1** on a beefy Mac Studio
+Best current local setup (what we’re running): **MiniMax M2.1** on a powerful local machine
via **LM Studio** using the **Responses API**.

```json5
@@ -11,7 +11,7 @@ Clawdbot.app uses SSH tunneling to connect to a remote gateway. This guide shows
```
┌──────────────────────────────────────────────────────────────┐
-│  MacBook                                                     │
+│  Client Machine                                              │
│                                                              │
│  Clawdbot.app ──► ws://127.0.0.1:18789 (local port)          │
│        │                                                     │
@@ -150,4 +150,4 @@ launchctl bootout gui/$UID/com.clawdbot.ssh-tunnel
| `KeepAlive` | Automatically restarts tunnel if it crashes |
| `RunAtLoad` | Starts tunnel when the agent loads |

-Clawdbot.app connects to `ws://127.0.0.1:18789` on your MacBook. The SSH tunnel forwards that connection to port 18789 on the remote machine where the Gateway is running.
+Clawdbot.app connects to `ws://127.0.0.1:18789` on your client machine. The SSH tunnel forwards that connection to port 18789 on the remote machine where the Gateway is running.
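The forwarding described above is a standard `ssh -L` local port-forward; the launchd job just keeps it alive. A minimal hand-run equivalent, using the generic `user@gateway-host` placeholder from the other examples:

```bash
# Forward local port 18789 to port 18789 on the gateway host, without opening a shell.
# Replace user@gateway-host with your actual SSH target.
ssh -N -L 18789:127.0.0.1:18789 user@gateway-host
```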
@@ -5,7 +5,7 @@ read_when:
---
# Remote access (SSH, tunnels, and tailnets)

-This repo supports “remote over SSH” by keeping a single Gateway (the master) running on a host (e.g., your Mac Studio) and connecting clients to it.
+This repo supports “remote over SSH” by keeping a single Gateway (the master) running on a dedicated host (desktop/server) and connecting clients to it.

- For **operators (you / the macOS app)**: SSH tunneling is the universal fallback.
- For **nodes (iOS/Android and future devices)**: prefer the Gateway **Bridge** when on the same LAN/tailnet (see [Discovery](/gateway/discovery)).
@@ -6,7 +6,7 @@ read_when:
# Remote Clawdbot (macOS ⇄ remote host)

-This flow lets the macOS app act as a full remote control for a Clawdbot gateway running on another host (e.g. a Mac Studio). All features—health checks, Voice Wake forwarding, and Web Chat—reuse the same remote SSH configuration from *Settings → General*.
+This flow lets the macOS app act as a full remote control for a Clawdbot gateway running on another host (desktop/server). All features—health checks, Voice Wake forwarding, and Web Chat—reuse the same remote SSH configuration from *Settings → General*.

## Modes
- **Local (this Mac)**: Everything runs on the laptop. No SSH involved.
@@ -22,6 +22,12 @@ Quick mental model:
- Configure under `plugins.entries.voice-call.config`
- Use `clawdbot voicecall …` or the `voice_call` tool

+## Where it runs (local vs remote)
+
+The Voice Call plugin runs **inside the Gateway process**.
+
+If you use a remote Gateway, install/configure the plugin on the **machine running the Gateway**, then restart the Gateway to load it.
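Since the plugin has to live on the Gateway host, a remote install is just the normal install run over SSH. A rough sketch only: the npm package name below is hypothetical, and the restart step depends on how you run the Gateway.

```bash
# Hypothetical package name; substitute the real one from the install section below.
ssh user@gateway-host 'npm install -g @clawdbot/voice-call'
# Then restart the Gateway process on that host (via whatever supervisor you use)
# so it loads the newly installed plugin.
```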
## Install

### Option A: install from npm (recommended)
@@ -123,7 +123,7 @@ Configure via CLI:
**Best for:** local inference with LM Studio.
We have seen strong results with MiniMax M2.1 on powerful hardware (e.g. a
-beefy Mac Studio) using LM Studio's local server.
+desktop/server) using LM Studio's local server.
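Before pointing the CLI at it, it can help to confirm that LM Studio's local server is actually reachable. A quick check, assuming LM Studio's default port of 1234 (adjust if you changed it):

```bash
# Lists the models LM Studio's OpenAI-compatible server currently exposes.
curl -s http://127.0.0.1:1234/v1/models
```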
Configure via CLI:
- Run `clawdbot configure`