feat: add per-session agent sandbox
@@ -16,6 +16,7 @@
 - Config: expose schema + UI hints for generic config forms (Web UI + future clients).
 - Skills: add blogwatcher skill for RSS/Atom monitoring — thanks @Hyaxia.
 - Discord: emit system events for reaction add/remove with per-guild reaction notifications (off|own|all|allowlist) (#140) — thanks @thewilloftheshadow.
+- Agent: add optional per-session Docker sandbox for tool execution (`agent.sandbox`) with allow/deny policy and auto-pruning.
 
 ### Fixes
 
 - Auto-reply: drop final payloads when block streaming to avoid duplicate Discord sends.
@@ -53,6 +54,7 @@
 - Gateway: document config hot reload + reload matrix.
 - Onboarding/Config: add protocol notes for wizard + schema RPC.
 - Queue: clarify steer-backlog behavior with inline commands and update examples for streaming surfaces.
+- Sandbox: document per-session agent sandbox setup, config, and Docker build.
 
 ## 2.0.0-beta5 — 2026-01-03
 
Dockerfile.sandbox (new file, 16 lines)
@@ -0,0 +1,16 @@
FROM debian:bookworm-slim

ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get update \
    && apt-get install -y --no-install-recommends \
       bash \
       ca-certificates \
       curl \
       git \
       jq \
       python3 \
       ripgrep \
    && rm -rf /var/lib/apt/lists/*

CMD ["sleep", "infinity"]
@@ -33,6 +33,11 @@ better forms without hard-coding config knowledge.
 }
 ```
 
+Build the default image once with:
+```bash
+scripts/sandbox-setup.sh
+```
+
 ## Self-chat mode (recommended for group control)
 
 To prevent the bot from responding to WhatsApp @-mentions in groups (only respond to specific text triggers):
@@ -323,6 +328,9 @@ Default: `~/clawd`.
 }
 ```
 
+If `agent.sandbox` is enabled, non-main sessions can override this with their
+own per-session workspaces under `agent.sandbox.workspaceRoot`.
+
 ### `messages`
 
 Controls inbound/outbound prefixes and timestamps.
@@ -435,6 +443,50 @@ Z.AI models are available as `zai/<model>` (e.g. `zai/glm-4.7`) and require
 execute in parallel across sessions. Each session is still serialized (one run
 per session key at a time). Default: 1.
 
+### `agent.sandbox`
+
+Optional per-session **Docker sandboxing** for the embedded agent. Intended for
+non-main sessions so they cannot access your host system.
+
+Defaults (if enabled):
+- one container per session
+- Debian bookworm-slim based image
+- workspace per session under `~/.clawdis/sandboxes`
+- auto-prune: idle > 24h OR age > 7d
+- tools: allow only `bash`, `process`, `read`, `write`, `edit` (deny wins)
+
+```json5
+{
+  agent: {
+    sandbox: {
+      mode: "non-main", // off | non-main | all
+      perSession: true,
+      workspaceRoot: "~/.clawdis/sandboxes",
+      docker: {
+        image: "clawdis-sandbox:bookworm-slim",
+        containerPrefix: "clawdis-sbx-",
+        workdir: "/workspace",
+        readOnlyRoot: true,
+        tmpfs: ["/tmp", "/var/tmp", "/run"],
+        network: "bridge",
+        user: "1000:1000",
+        capDrop: ["ALL"],
+        env: { LANG: "C.UTF-8" },
+        setupCommand: "apt-get update && apt-get install -y git curl jq"
+      },
+      tools: {
+        allow: ["bash", "process", "read", "write", "edit"],
+        deny: ["browser", "canvas", "nodes", "cron", "discord", "gateway"]
+      },
+      prune: {
+        idleHours: 24, // 0 disables idle pruning
+        maxAgeDays: 7 // 0 disables max-age pruning
+      }
+    }
+  }
+}
+```
+
 ### `models` (custom providers + base URLs)
 
 Clawdis uses the **pi-coding-agent** model catalog. You can add custom providers
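
Not part of the commit — a minimal TypeScript sketch of the `mode` semantics described above (`off` | `non-main` | `all`); the helper name is hypothetical, but the behavior mirrors `shouldSandboxSession` added in `src/agents/sandbox.ts` below.

```ts
type SandboxMode = "off" | "non-main" | "all";

// Hypothetical helper mirroring the documented semantics of `agent.sandbox.mode`.
function isSandboxed(mode: SandboxMode, sessionKey: string, mainKey = "main"): boolean {
  if (mode === "off") return false;      // never sandbox
  if (mode === "all") return true;       // sandbox every session, including main
  return sessionKey.trim() !== mainKey;  // "non-main": only sessions other than the main one
}

// Example: with mode "non-main", the main session stays on the host,
// while e.g. a Discord group session runs its tools in Docker.
isSandboxed("non-main", "main");            // false
isSandboxed("non-main", "discord:guild-1"); // true
```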
docs/docker.md (139 lines changed)
@@ -9,16 +9,27 @@ read_when:
 
 Docker is **optional**. Use it only if you want a containerized gateway or to validate the Docker flow.
 
-## Quick start (recommended)
-
-From the repo root:
+This guide covers:
+
+- Containerized Gateway (full Clawdis in Docker)
+- Per-session Agent Sandbox (host gateway + Docker-isolated agent tools)
+
+## Requirements
+
+- Docker Desktop (or Docker Engine) + Docker Compose v2
+- Enough disk for images + logs
+
+## Containerized Gateway (Docker Compose)
+
+### Quick start (recommended)
+
+From repo root:
 
 ```bash
 ./docker-setup.sh
 ```
 
 This script:
-- builds the image
+- builds the gateway image
 - runs the onboarding wizard
 - runs WhatsApp login
 - starts the gateway via Docker Compose
@@ -27,7 +38,7 @@ It writes config/workspace on the host:
 - `~/.clawdis/`
 - `~/clawd`
 
-## Manual flow (compose)
+### Manual flow (compose)
 
 ```bash
 docker build -t clawdis:local -f Dockerfile .
@@ -36,14 +47,126 @@ docker compose run --rm clawdis-cli login
 docker compose up -d clawdis-gateway
 ```
 
-## E2E smoke test (Docker)
+### Health check
+
+```bash
+docker compose exec clawdis-gateway node dist/index.js health --token "$CLAWDIS_GATEWAY_TOKEN"
+```
+
+### E2E smoke test (Docker)
 
 ```bash
 scripts/e2e/onboard-docker.sh
 ```
 
-## Notes
+### Notes
 
 - Gateway bind defaults to `lan` for container use.
-- Health check:
-  `docker compose exec clawdis-gateway node dist/index.js health --token "$CLAWDIS_GATEWAY_TOKEN"`
+- The gateway container is the source of truth for sessions (`~/.clawdis/sessions`).
+
+## Per-session Agent Sandbox (host gateway + Docker tools)
+
+### What it does
+
+When `agent.sandbox` is enabled, **non-main sessions** run tools inside a Docker
+container. The gateway stays on your host, but the tool execution is isolated:
+- one container per session (hard wall)
+- per-session workspace folder mounted at `/workspace`
+- allow/deny tool policy (deny wins)
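
Illustration only (the committed logic lands in `src/agents/sandbox.ts` later in this diff): a sketch of how the per-session hard wall maps onto `docker create` arguments built from the `agent.sandbox.docker` settings.

```ts
// Sketch: assemble `docker create` args from the sandbox docker settings.
// Field names follow the config shown in this commit; the helper itself is illustrative.
function sandboxCreateArgs(opts: {
  name: string;
  image: string;
  workdir: string;
  workspaceDir: string;
  readOnlyRoot: boolean;
  tmpfs: string[];
  network: string;
  user?: string;
  capDrop: string[];
}): string[] {
  const args = ["create", "--name", opts.name];
  if (opts.readOnlyRoot) args.push("--read-only");
  for (const mount of opts.tmpfs) args.push("--tmpfs", mount);
  if (opts.network) args.push("--network", opts.network);
  if (opts.user) args.push("--user", opts.user);
  for (const cap of opts.capDrop) args.push("--cap-drop", cap);
  args.push("--workdir", opts.workdir);
  args.push("-v", `${opts.workspaceDir}:${opts.workdir}`); // per-session workspace mount
  args.push(opts.image, "sleep", "infinity");              // keep the container alive for exec
  return args;
}
```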
+
+### Default behavior
+
+- Image: `clawdis-sandbox:bookworm-slim`
+- One container per session
+- Workspace per session under `~/.clawdis/sandboxes`
+- Auto-prune: idle > 24h OR age > 7d
+- Default allow: `bash`, `process`, `read`, `write`, `edit`
+- Default deny: `browser`, `canvas`, `nodes`, `cron`, `discord`, `gateway`
+
+### Enable sandboxing
+
+```json5
+{
+  agent: {
+    sandbox: {
+      mode: "non-main", // off | non-main | all
+      perSession: true,
+      workspaceRoot: "~/.clawdis/sandboxes",
+      docker: {
+        image: "clawdis-sandbox:bookworm-slim",
+        workdir: "/workspace",
+        readOnlyRoot: true,
+        tmpfs: ["/tmp", "/var/tmp", "/run"],
+        network: "bridge",
+        user: "1000:1000",
+        capDrop: ["ALL"],
+        env: { LANG: "C.UTF-8" },
+        setupCommand: "apt-get update && apt-get install -y git curl jq"
+      },
+      tools: {
+        allow: ["bash", "process", "read", "write", "edit"],
+        deny: ["browser", "canvas", "nodes", "cron", "discord", "gateway"]
+      },
+      prune: {
+        idleHours: 24, // 0 disables idle pruning
+        maxAgeDays: 7 // 0 disables max-age pruning
+      }
+    }
+  }
+}
+```
+
+### Build the default sandbox image
+
+```bash
+scripts/sandbox-setup.sh
+```
+
+This builds `clawdis-sandbox:bookworm-slim` using `Dockerfile.sandbox`.
+
+### Custom sandbox image
+
+Build your own image and point config to it:
+
+```bash
+docker build -t my-clawdis-sbx -f Dockerfile.sandbox .
+```
+
+```json5
+{
+  agent: {
+    sandbox: { docker: { image: "my-clawdis-sbx" } }
+  }
+}
+```
+
+### Tool policy (allow/deny)
+
+- `deny` wins over `allow`.
+- If `allow` is empty: all tools (except deny) are available.
+- If `allow` is non-empty: only tools in `allow` are available (minus deny).
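
A small sketch of that resolution order (it mirrors `filterToolsByPolicy` added elsewhere in this commit; the standalone helper here is illustrative only):

```ts
// Sketch of the allow/deny resolution described above (deny always wins).
type ToolPolicy = { allow?: string[]; deny?: string[] };

function toolIsAvailable(name: string, policy: ToolPolicy): boolean {
  const deny = new Set((policy.deny ?? []).map((t) => t.trim().toLowerCase()));
  const allow = (policy.allow ?? []).map((t) => t.trim().toLowerCase());
  const key = name.toLowerCase();
  if (deny.has(key)) return false;     // deny wins
  if (allow.length === 0) return true; // empty allow => everything not denied
  return allow.includes(key);          // non-empty allow => allow-list only
}

toolIsAvailable("bash", { allow: ["bash"], deny: ["browser"] });       // true
toolIsAvailable("browser", { allow: ["browser"], deny: ["browser"] }); // false — deny wins
```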
+
+### Pruning strategy
+
+Two knobs:
+- `prune.idleHours`: remove containers not used in X hours (0 = disable)
+- `prune.maxAgeDays`: remove containers older than X days (0 = disable)
+
+Example:
+- Keep busy sessions but cap lifetime:
+  `idleHours: 24`, `maxAgeDays: 7`
+- Never prune:
+  `idleHours: 0`, `maxAgeDays: 0`
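
For clarity, the prune decision those two knobs drive, as a sketch (the committed version lives in `pruneSandboxContainers` and works off the registry's `createdAtMs` / `lastUsedAtMs` timestamps):

```ts
// Returns true when a sandbox container should be removed under the prune settings.
function shouldPrune(
  entry: { createdAtMs: number; lastUsedAtMs: number },
  prune: { idleHours: number; maxAgeDays: number },
  now = Date.now(),
): boolean {
  const idleMs = now - entry.lastUsedAtMs;
  const ageMs = now - entry.createdAtMs;
  const idleLimit = prune.idleHours * 60 * 60 * 1000;      // 0 disables the idle check
  const ageLimit = prune.maxAgeDays * 24 * 60 * 60 * 1000; // 0 disables the age check
  return (
    (prune.idleHours > 0 && idleMs > idleLimit) ||
    (prune.maxAgeDays > 0 && ageMs > ageLimit)
  );
}
```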
+
+### Security notes
+
+- Hard wall only applies to **tools** (bash/read/write/edit).
+- Host-only tools like browser/camera/canvas are blocked by default.
+- Allowing `browser` in sandbox **breaks isolation** (browser runs on host).
+
+## Troubleshooting
+
+- Image missing: build with `scripts/sandbox-setup.sh` or set `agent.sandbox.docker.image`.
+- Container not running: it will auto-create per session on demand.
+- Permission errors in sandbox: set `docker.user` to a UID:GID that matches your
+  mounted workspace ownership (or chown the workspace folder).
@@ -99,6 +99,12 @@ services:
 network_mode: bridge # Limited network
 ```
 
+### Per-session sandbox (Clawdis-native)
+
+Clawdis can also run **non-main sessions** inside per-session Docker containers
+(`agent.sandbox`). This keeps the gateway on your host while isolating agent
+tools in a hard wall container. See `docs/configuration.md` for the full config.
+
 Expose only the services your AI needs:
 - ✅ GoWA API (for WhatsApp)
 - ✅ Specific HTTP APIs
scripts/sandbox-setup.sh (new executable file, 7 lines)
@@ -0,0 +1,7 @@
#!/usr/bin/env bash
set -euo pipefail

IMAGE_NAME="clawdis-sandbox:bookworm-slim"

docker build -t "${IMAGE_NAME}" -f Dockerfile.sandbox .
echo "Built ${IMAGE_NAME}"
@@ -1,6 +1,7 @@
 import { type ChildProcessWithoutNullStreams, spawn } from "node:child_process";
 import { randomUUID } from "node:crypto";
 import { existsSync, statSync } from "node:fs";
+import fs from "node:fs/promises";
 import { homedir } from "node:os";
 import type { AgentTool, AgentToolResult } from "@mariozechner/pi-agent-core";
 import { Type } from "@sinclair/typebox";
@@ -18,6 +19,7 @@ import {
   markExited,
   setJobTtlMs,
 } from "./bash-process-registry.js";
+import { assertSandboxPath } from "./sandbox-paths.js";
 import {
   getShellConfig,
   killProcessTree,
@@ -47,12 +49,20 @@ const stringEnum = (
 export type BashToolDefaults = {
   backgroundMs?: number;
   timeoutSec?: number;
+  sandbox?: BashSandboxConfig;
 };
 
 export type ProcessToolDefaults = {
   cleanupMs?: number;
 };
 
+export type BashSandboxConfig = {
+  containerName: string;
+  workspaceDir: string;
+  containerWorkdir: string;
+  env?: Record<string, string>;
+};
+
 const bashSchema = Type.Object({
   command: Type.String({ description: "Bash command to execute" }),
   workdir: Type.Optional(
@@ -136,19 +146,55 @@ export function createBashTool(
     const startedAt = Date.now();
     const sessionId = randomUUID();
     const warnings: string[] = [];
-    const workdir = resolveWorkdir(
-      params.workdir?.trim() || process.cwd(),
-      warnings,
-    );
+    const sandbox = defaults?.sandbox;
+    const rawWorkdir = params.workdir?.trim() || process.cwd();
+    let workdir = rawWorkdir;
+    let containerWorkdir = sandbox?.containerWorkdir;
+    if (sandbox) {
+      const resolved = await resolveSandboxWorkdir({
+        workdir: rawWorkdir,
+        sandbox,
+        warnings,
+      });
+      workdir = resolved.hostWorkdir;
+      containerWorkdir = resolved.containerWorkdir;
+    } else {
+      workdir = resolveWorkdir(rawWorkdir, warnings);
+    }
+
     const { shell, args: shellArgs } = getShellConfig();
-    const env = params.env ? { ...process.env, ...params.env } : process.env;
-    const child = spawn(shell, [...shellArgs, params.command], {
-      cwd: workdir,
-      env,
-      detached: true,
-      stdio: ["pipe", "pipe", "pipe"],
-    });
+    const baseEnv = coerceEnv(process.env);
+    const mergedEnv = params.env ? { ...baseEnv, ...params.env } : baseEnv;
+    const env = sandbox
+      ? buildSandboxEnv({
+          paramsEnv: params.env,
+          sandboxEnv: sandbox.env,
+          containerWorkdir: containerWorkdir ?? sandbox.containerWorkdir,
+        })
+      : mergedEnv;
+    const child = sandbox
+      ? spawn(
+          "docker",
+          buildDockerExecArgs({
+            containerName: sandbox.containerName,
+            command: params.command,
+            workdir: containerWorkdir ?? sandbox.containerWorkdir,
+            env,
+            tty: false,
+          }),
+          {
+            cwd: workdir,
+            env: process.env,
+            detached: true,
+            stdio: ["pipe", "pipe", "pipe"],
+          },
+        )
+      : spawn(shell, [...shellArgs, params.command], {
+          cwd: workdir,
+          env,
+          detached: true,
+          stdio: ["pipe", "pipe", "pipe"],
+        });
 
     const session = {
       id: sessionId,
@@ -776,6 +822,86 @@ export function createProcessTool(
 
 export const processTool = createProcessTool();
 
+function buildSandboxEnv(params: {
+  paramsEnv?: Record<string, string>;
+  sandboxEnv?: Record<string, string>;
+  containerWorkdir: string;
+}) {
+  const env: Record<string, string> = {
+    PATH: DEFAULT_PATH,
+    HOME: params.containerWorkdir,
+  };
+  for (const [key, value] of Object.entries(params.sandboxEnv ?? {})) {
+    env[key] = value;
+  }
+  for (const [key, value] of Object.entries(params.paramsEnv ?? {})) {
+    env[key] = value;
+  }
+  return env;
+}
+
+function coerceEnv(env?: NodeJS.ProcessEnv | Record<string, string>) {
+  const record: Record<string, string> = {};
+  if (!env) return record;
+  for (const [key, value] of Object.entries(env)) {
+    if (typeof value === "string") record[key] = value;
+  }
+  return record;
+}
+
+function buildDockerExecArgs(params: {
+  containerName: string;
+  command: string;
+  workdir?: string;
+  env: Record<string, string>;
+  tty: boolean;
+}) {
+  const args = ["exec", "-i"];
+  if (params.tty) args.push("-t");
+  if (params.workdir) {
+    args.push("-w", params.workdir);
+  }
+  for (const [key, value] of Object.entries(params.env)) {
+    args.push("-e", `${key}=${value}`);
+  }
+  args.push(params.containerName, "sh", "-lc", params.command);
+  return args;
+}
+
+async function resolveSandboxWorkdir(params: {
+  workdir: string;
+  sandbox: BashSandboxConfig;
+  warnings: string[];
+}) {
+  const fallback = params.sandbox.workspaceDir;
+  try {
+    const resolved = await assertSandboxPath({
+      filePath: params.workdir,
+      cwd: process.cwd(),
+      root: params.sandbox.workspaceDir,
+    });
+    const stats = await fs.stat(resolved.resolved);
+    if (!stats.isDirectory()) {
+      throw new Error("workdir is not a directory");
+    }
+    const relative = resolved.relative
+      ? resolved.relative.split(path.sep).join(path.posix.sep)
+      : "";
+    const containerWorkdir = relative
+      ? path.posix.join(params.sandbox.containerWorkdir, relative)
+      : params.sandbox.containerWorkdir;
+    return { hostWorkdir: resolved.resolved, containerWorkdir };
+  } catch {
+    params.warnings.push(
+      `Warning: workdir "${params.workdir}" is unavailable; using "${fallback}".`,
+    );
+    return {
+      hostWorkdir: fallback,
+      containerWorkdir: params.sandbox.containerWorkdir,
+    };
+  }
+}
+
 function killSession(session: {
   pid?: number;
   child?: ChildProcessWithoutNullStreams;
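
Editor's note, not part of the diff: `buildDockerExecArgs` above is a module-private helper; a hedged example of the argv it produces for sample inputs (values are illustrative):

```ts
// Illustrative only — what the spawn("docker", args) call above receives for a sample input.
const args = buildDockerExecArgs({
  containerName: "clawdis-sbx-demo", // hypothetical container name
  command: "ls /workspace",
  workdir: "/workspace",
  env: { HOME: "/workspace" },
  tty: false,
});
// args === ["exec", "-i", "-w", "/workspace", "-e", "HOME=/workspace",
//           "clawdis-sbx-demo", "sh", "-lc", "ls /workspace"]
```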
@@ -158,4 +158,33 @@ describe("createClawdisCodingTools", () => {
       await fs.rm(tmpDir, { recursive: true, force: true });
     }
   });
 
+  it("filters tools by sandbox policy", () => {
+    const sandbox = {
+      enabled: true,
+      sessionKey: "sandbox:test",
+      workspaceDir: path.join(os.tmpdir(), "clawdis-sandbox"),
+      containerName: "clawdis-sbx-test",
+      containerWorkdir: "/workspace",
+      docker: {
+        image: "clawdis-sandbox:bookworm-slim",
+        containerPrefix: "clawdis-sbx-",
+        workdir: "/workspace",
+        readOnlyRoot: true,
+        tmpfs: [],
+        network: "none",
+        user: "1000:1000",
+        capDrop: ["ALL"],
+        env: { LANG: "C.UTF-8" },
+      },
+      tools: {
+        allow: ["bash"],
+        deny: ["browser"],
+      },
+    };
+    const tools = createClawdisCodingTools({ sandbox });
+    expect(tools.some((tool) => tool.name === "bash")).toBe(true);
+    expect(tools.some((tool) => tool.name === "read")).toBe(false);
+    expect(tools.some((tool) => tool.name === "browser")).toBe(false);
+  });
 });
@@ -1,5 +1,11 @@
 import type { AgentTool, AgentToolResult } from "@mariozechner/pi-agent-core";
-import { codingTools, readTool } from "@mariozechner/pi-coding-agent";
+import {
+  codingTools,
+  createEditTool,
+  createReadTool,
+  createWriteTool,
+  readTool,
+} from "@mariozechner/pi-coding-agent";
 import { Type } from "@sinclair/typebox";
 
 import { detectMime } from "../media/mime.js";
@@ -11,6 +17,8 @@ import {
   type ProcessToolDefaults,
 } from "./bash-tools.js";
 import { createClawdisTools } from "./clawdis-tools.js";
+import type { SandboxContext, SandboxToolPolicy } from "./sandbox.js";
+import { assertSandboxPath } from "./sandbox-paths.js";
 import { sanitizeToolResultImages } from "./tool-images.js";
 
 // NOTE(steipete): Upstream read now does file-magic MIME detection; we keep the wrapper
@@ -284,6 +292,59 @@ function normalizeToolParameters(tool: AnyAgentTool): AnyAgentTool {
   };
 }
 
+function normalizeToolNames(list?: string[]) {
+  if (!list) return [];
+  return list.map((entry) => entry.trim().toLowerCase()).filter(Boolean);
+}
+
+function filterToolsByPolicy(
+  tools: AnyAgentTool[],
+  policy?: SandboxToolPolicy,
+) {
+  if (!policy) return tools;
+  const deny = new Set(normalizeToolNames(policy.deny));
+  const allowRaw = normalizeToolNames(policy.allow);
+  const allow = allowRaw.length > 0 ? new Set(allowRaw) : null;
+  return tools.filter((tool) => {
+    const name = tool.name.toLowerCase();
+    if (deny.has(name)) return false;
+    if (allow) return allow.has(name);
+    return true;
+  });
+}
+
+function wrapSandboxPathGuard(tool: AnyAgentTool, root: string): AnyAgentTool {
+  return {
+    ...tool,
+    execute: async (toolCallId, args, signal, onUpdate) => {
+      const record =
+        args && typeof args === "object"
+          ? (args as Record<string, unknown>)
+          : undefined;
+      const filePath = record?.path;
+      if (typeof filePath === "string" && filePath.trim()) {
+        await assertSandboxPath({ filePath, cwd: root, root });
+      }
+      return tool.execute(toolCallId, args, signal, onUpdate);
+    },
+  };
+}
+
+function createSandboxedReadTool(root: string) {
+  const base = createReadTool(root);
+  return wrapSandboxPathGuard(createClawdisReadTool(base), root);
+}
+
+function createSandboxedWriteTool(root: string) {
+  const base = createWriteTool(root);
+  return wrapSandboxPathGuard(base as unknown as AnyAgentTool, root);
+}
+
+function createSandboxedEditTool(root: string) {
+  const base = createEditTool(root);
+  return wrapSandboxPathGuard(base as unknown as AnyAgentTool, root);
+}
+
 function createWhatsAppLoginTool(): AnyAgentTool {
   return {
     label: "WhatsApp Login",
@@ -383,19 +444,45 @@ function shouldIncludeDiscordTool(surface?: string): boolean {
 export function createClawdisCodingTools(options?: {
   bash?: BashToolDefaults & ProcessToolDefaults;
   surface?: string;
+  sandbox?: SandboxContext | null;
 }): AnyAgentTool[] {
   const bashToolName = "bash";
+  const sandbox = options?.sandbox?.enabled ? options.sandbox : undefined;
+  const sandboxRoot = sandbox?.workspaceDir;
   const base = (codingTools as unknown as AnyAgentTool[]).flatMap((tool) => {
-    if (tool.name === readTool.name) return [createClawdisReadTool(tool)];
+    if (tool.name === readTool.name) {
+      return sandboxRoot
+        ? [createSandboxedReadTool(sandboxRoot)]
+        : [createClawdisReadTool(tool)];
+    }
     if (tool.name === bashToolName) return [];
+    if (sandboxRoot && (tool.name === "write" || tool.name === "edit")) {
+      return [];
+    }
     return [tool as AnyAgentTool];
   });
-  const bashTool = createBashTool(options?.bash);
+  const bashTool = createBashTool({
+    ...options?.bash,
+    sandbox: sandbox
+      ? {
+          containerName: sandbox.containerName,
+          workspaceDir: sandbox.workspaceDir,
+          containerWorkdir: sandbox.containerWorkdir,
+          env: sandbox.docker.env,
+        }
+      : undefined,
+  });
   const processTool = createProcessTool({
     cleanupMs: options?.bash?.cleanupMs,
   });
   const tools: AnyAgentTool[] = [
     ...base,
+    ...(sandboxRoot
+      ? [
+          createSandboxedEditTool(sandboxRoot),
+          createSandboxedWriteTool(sandboxRoot),
+        ]
+      : []),
     bashTool as unknown as AnyAgentTool,
     processTool as unknown as AnyAgentTool,
     createWhatsAppLoginTool(),
@@ -405,5 +492,8 @@ export function createClawdisCodingTools(options?: {
   const filtered = allowDiscord
     ? tools
     : tools.filter((tool) => tool.name !== "discord");
-  return filtered.map(normalizeToolParameters);
+  const sandboxed = sandbox
+    ? filterToolsByPolicy(filtered, sandbox.tools)
+    : filtered;
+  return sandboxed.map(normalizeToolParameters);
 }
src/agents/sandbox-paths.ts (new file, 83 lines)
@@ -0,0 +1,83 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

const UNICODE_SPACES = /[\u00A0\u2000-\u200A\u202F\u205F\u3000]/g;

function normalizeUnicodeSpaces(str: string): string {
  return str.replace(UNICODE_SPACES, " ");
}

function expandPath(filePath: string): string {
  const normalized = normalizeUnicodeSpaces(filePath);
  if (normalized === "~") {
    return os.homedir();
  }
  if (normalized.startsWith("~/")) {
    return os.homedir() + normalized.slice(1);
  }
  return normalized;
}

function resolveToCwd(filePath: string, cwd: string): string {
  const expanded = expandPath(filePath);
  if (path.isAbsolute(expanded)) return expanded;
  return path.resolve(cwd, expanded);
}

export function resolveSandboxPath(params: {
  filePath: string;
  cwd: string;
  root: string;
}): { resolved: string; relative: string } {
  const resolved = resolveToCwd(params.filePath, params.cwd);
  const rootResolved = path.resolve(params.root);
  const relative = path.relative(rootResolved, resolved);
  if (!relative || relative === "") {
    return { resolved, relative: "" };
  }
  if (relative.startsWith("..") || path.isAbsolute(relative)) {
    throw new Error(
      `Path escapes sandbox root (${shortPath(rootResolved)}): ${params.filePath}`,
    );
  }
  return { resolved, relative };
}

export async function assertSandboxPath(params: {
  filePath: string;
  cwd: string;
  root: string;
}) {
  const resolved = resolveSandboxPath(params);
  await assertNoSymlink(resolved.relative, path.resolve(params.root));
  return resolved;
}

async function assertNoSymlink(relative: string, root: string) {
  if (!relative) return;
  const parts = relative.split(path.sep).filter(Boolean);
  let current = root;
  for (const part of parts) {
    current = path.join(current, part);
    try {
      const stat = await fs.lstat(current);
      if (stat.isSymbolicLink()) {
        throw new Error(`Symlink not allowed in sandbox path: ${current}`);
      }
    } catch (err) {
      const anyErr = err as { code?: string };
      if (anyErr.code === "ENOENT") {
        return;
      }
      throw err;
    }
  }
}

function shortPath(value: string) {
  if (value.startsWith(os.homedir())) {
    return `~${value.slice(os.homedir().length)}`;
  }
  return value;
}
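
A hedged usage sketch of the guard above (paths are illustrative; not part of the commit): tool arguments are resolved against the per-session workspace root, and anything escaping it — including via symlinks — throws before the tool runs.

```ts
import { assertSandboxPath } from "./sandbox-paths.js";

const root = "/home/user/.clawdis/sandboxes/demo-1a2b3c4d"; // hypothetical per-session workspace

// Stays inside the workspace: resolves fine.
await assertSandboxPath({ filePath: "notes/todo.md", cwd: root, root });

// Escapes the workspace: throws "Path escapes sandbox root (...)".
await assertSandboxPath({ filePath: "../../.ssh/id_rsa", cwd: root, root }).catch((err) => {
  console.error(String(err));
});
```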
src/agents/sandbox.ts (new file, 438 lines)
@@ -0,0 +1,438 @@
import { spawn } from "node:child_process";
import crypto from "node:crypto";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

import type { ClawdisConfig } from "../config/config.js";
import { STATE_DIR_CLAWDIS } from "../config/config.js";
import { defaultRuntime } from "../runtime.js";
import { resolveUserPath } from "../utils.js";
import {
  DEFAULT_AGENT_WORKSPACE_DIR,
  DEFAULT_AGENTS_FILENAME,
  DEFAULT_BOOTSTRAP_FILENAME,
  DEFAULT_IDENTITY_FILENAME,
  DEFAULT_SOUL_FILENAME,
  DEFAULT_TOOLS_FILENAME,
  DEFAULT_USER_FILENAME,
  ensureAgentWorkspace,
} from "./workspace.js";

export type SandboxToolPolicy = {
  allow?: string[];
  deny?: string[];
};

export type SandboxDockerConfig = {
  image: string;
  containerPrefix: string;
  workdir: string;
  readOnlyRoot: boolean;
  tmpfs: string[];
  network: string;
  user?: string;
  capDrop: string[];
  env?: Record<string, string>;
  setupCommand?: string;
};

export type SandboxPruneConfig = {
  idleHours: number;
  maxAgeDays: number;
};

export type SandboxConfig = {
  mode: "off" | "non-main" | "all";
  perSession: boolean;
  workspaceRoot: string;
  docker: SandboxDockerConfig;
  tools: SandboxToolPolicy;
  prune: SandboxPruneConfig;
};

export type SandboxContext = {
  enabled: boolean;
  sessionKey: string;
  workspaceDir: string;
  containerName: string;
  containerWorkdir: string;
  docker: SandboxDockerConfig;
  tools: SandboxToolPolicy;
};

const DEFAULT_SANDBOX_WORKSPACE_ROOT = path.join(
  os.homedir(),
  ".clawdis",
  "sandboxes",
);
const DEFAULT_SANDBOX_IMAGE = "clawdis-sandbox:bookworm-slim";
const DEFAULT_SANDBOX_CONTAINER_PREFIX = "clawdis-sbx-";
const DEFAULT_SANDBOX_WORKDIR = "/workspace";
const DEFAULT_SANDBOX_IDLE_HOURS = 24;
const DEFAULT_SANDBOX_MAX_AGE_DAYS = 7;
const DEFAULT_TOOL_ALLOW = ["bash", "process", "read", "write", "edit"];
const DEFAULT_TOOL_DENY = [
  "browser",
  "canvas",
  "nodes",
  "cron",
  "discord",
  "gateway",
];

const SANDBOX_STATE_DIR = path.join(STATE_DIR_CLAWDIS, "sandbox");
const SANDBOX_REGISTRY_PATH = path.join(SANDBOX_STATE_DIR, "containers.json");

type SandboxRegistryEntry = {
  containerName: string;
  sessionKey: string;
  createdAtMs: number;
  lastUsedAtMs: number;
  image: string;
};

type SandboxRegistry = {
  entries: SandboxRegistryEntry[];
};

let lastPruneAtMs = 0;

function defaultSandboxConfig(cfg?: ClawdisConfig): SandboxConfig {
  const agent = cfg?.agent?.sandbox;
  return {
    mode: agent?.mode ?? "off",
    perSession: agent?.perSession ?? true,
    workspaceRoot: agent?.workspaceRoot ?? DEFAULT_SANDBOX_WORKSPACE_ROOT,
    docker: {
      image: agent?.docker?.image ?? DEFAULT_SANDBOX_IMAGE,
      containerPrefix:
        agent?.docker?.containerPrefix ?? DEFAULT_SANDBOX_CONTAINER_PREFIX,
      workdir: agent?.docker?.workdir ?? DEFAULT_SANDBOX_WORKDIR,
      readOnlyRoot: agent?.docker?.readOnlyRoot ?? true,
      tmpfs: agent?.docker?.tmpfs ?? ["/tmp", "/var/tmp", "/run"],
      network: agent?.docker?.network ?? "bridge",
      user: agent?.docker?.user,
      capDrop: agent?.docker?.capDrop ?? ["ALL"],
      env: agent?.docker?.env ?? { LANG: "C.UTF-8" },
      setupCommand: agent?.docker?.setupCommand,
    },
    tools: {
      allow: agent?.tools?.allow ?? DEFAULT_TOOL_ALLOW,
      deny: agent?.tools?.deny ?? DEFAULT_TOOL_DENY,
    },
    prune: {
      idleHours: agent?.prune?.idleHours ?? DEFAULT_SANDBOX_IDLE_HOURS,
      maxAgeDays: agent?.prune?.maxAgeDays ?? DEFAULT_SANDBOX_MAX_AGE_DAYS,
    },
  };
}

function shouldSandboxSession(
  cfg: SandboxConfig,
  sessionKey: string,
  mainKey: string,
) {
  if (cfg.mode === "off") return false;
  if (cfg.mode === "all") return true;
  return sessionKey.trim() !== mainKey.trim();
}

function slugifySessionKey(value: string) {
  const trimmed = value.trim() || "session";
  const hash = crypto
    .createHash("sha1")
    .update(trimmed)
    .digest("hex")
    .slice(0, 8);
  const safe = trimmed
    .toLowerCase()
    .replace(/[^a-z0-9._-]+/g, "-")
    .replace(/^-+|-+$/g, "");
  const base = safe.slice(0, 32) || "session";
  return `${base}-${hash}`;
}

function resolveSandboxWorkspaceDir(root: string, sessionKey: string) {
  const resolvedRoot = resolveUserPath(root);
  const slug = slugifySessionKey(sessionKey);
  return path.join(resolvedRoot, slug);
}

async function readRegistry(): Promise<SandboxRegistry> {
  try {
    const raw = await fs.readFile(SANDBOX_REGISTRY_PATH, "utf-8");
    const parsed = JSON.parse(raw) as SandboxRegistry;
    if (parsed && Array.isArray(parsed.entries)) return parsed;
  } catch {
    // ignore
  }
  return { entries: [] };
}

async function writeRegistry(registry: SandboxRegistry) {
  await fs.mkdir(SANDBOX_STATE_DIR, { recursive: true });
  await fs.writeFile(
    SANDBOX_REGISTRY_PATH,
    `${JSON.stringify(registry, null, 2)}\n`,
    "utf-8",
  );
}

async function updateRegistry(entry: SandboxRegistryEntry) {
  const registry = await readRegistry();
  const existing = registry.entries.find(
    (item) => item.containerName === entry.containerName,
  );
  const next = registry.entries.filter(
    (item) => item.containerName !== entry.containerName,
  );
  next.push({
    ...entry,
    createdAtMs: existing?.createdAtMs ?? entry.createdAtMs,
    image: existing?.image ?? entry.image,
  });
  await writeRegistry({ entries: next });
}

async function removeRegistryEntry(containerName: string) {
  const registry = await readRegistry();
  const next = registry.entries.filter(
    (item) => item.containerName !== containerName,
  );
  if (next.length === registry.entries.length) return;
  await writeRegistry({ entries: next });
}

function execDocker(args: string[], opts?: { allowFailure?: boolean }) {
  return new Promise<{ stdout: string; stderr: string; code: number }>(
    (resolve, reject) => {
      const child = spawn("docker", args, {
        stdio: ["ignore", "pipe", "pipe"],
      });
      let stdout = "";
      let stderr = "";
      child.stdout?.on("data", (chunk) => {
        stdout += chunk.toString();
      });
      child.stderr?.on("data", (chunk) => {
        stderr += chunk.toString();
      });
      child.on("close", (code) => {
        const exitCode = code ?? 0;
        if (exitCode !== 0 && !opts?.allowFailure) {
          reject(new Error(stderr.trim() || `docker ${args.join(" ")} failed`));
          return;
        }
        resolve({ stdout, stderr, code: exitCode });
      });
    },
  );
}

async function dockerImageExists(image: string) {
  const result = await execDocker(["image", "inspect", image], {
    allowFailure: true,
  });
  return result.code === 0;
}

async function ensureDockerImage(image: string) {
  const exists = await dockerImageExists(image);
  if (exists) return;
  if (image === DEFAULT_SANDBOX_IMAGE) {
    await execDocker(["pull", "debian:bookworm-slim"]);
    await execDocker(["tag", "debian:bookworm-slim", DEFAULT_SANDBOX_IMAGE]);
    return;
  }
  throw new Error(`Sandbox image not found: ${image}. Build or pull it first.`);
}

async function dockerContainerState(name: string) {
  const result = await execDocker(
    ["inspect", "-f", "{{.State.Running}}", name],
    { allowFailure: true },
  );
  if (result.code !== 0) return { exists: false, running: false };
  return { exists: true, running: result.stdout.trim() === "true" };
}

async function ensureSandboxWorkspace(workspaceDir: string, seedFrom?: string) {
  await fs.mkdir(workspaceDir, { recursive: true });
  if (seedFrom) {
    const seed = resolveUserPath(seedFrom);
    const files = [
      DEFAULT_AGENTS_FILENAME,
      DEFAULT_SOUL_FILENAME,
      DEFAULT_TOOLS_FILENAME,
      DEFAULT_IDENTITY_FILENAME,
      DEFAULT_USER_FILENAME,
      DEFAULT_BOOTSTRAP_FILENAME,
    ];
    for (const name of files) {
      const src = path.join(seed, name);
      const dest = path.join(workspaceDir, name);
      try {
        await fs.access(dest);
      } catch {
        try {
          const content = await fs.readFile(src, "utf-8");
          await fs.writeFile(dest, content, { encoding: "utf-8", flag: "wx" });
        } catch {
          // ignore missing seed file
        }
      }
    }
  }
  await ensureAgentWorkspace({ dir: workspaceDir, ensureBootstrapFiles: true });
}

async function createSandboxContainer(params: {
  name: string;
  cfg: SandboxDockerConfig;
  workspaceDir: string;
  sessionKey: string;
}) {
  const { name, cfg, workspaceDir, sessionKey } = params;
  await ensureDockerImage(cfg.image);

  const args = ["create", "--name", name];
  args.push("--label", "clawdis.sandbox=1");
  args.push("--label", `clawdis.sessionKey=${sessionKey}`);
  args.push("--label", `clawdis.createdAtMs=${Date.now()}`);
  if (cfg.readOnlyRoot) args.push("--read-only");
  for (const entry of cfg.tmpfs) {
    args.push("--tmpfs", entry);
  }
  if (cfg.network) args.push("--network", cfg.network);
  if (cfg.user) args.push("--user", cfg.user);
  for (const cap of cfg.capDrop) {
    args.push("--cap-drop", cap);
  }
  args.push("--security-opt", "no-new-privileges");
  args.push("--workdir", cfg.workdir);
  args.push("-v", `${workspaceDir}:${cfg.workdir}`);
  args.push(cfg.image, "sleep", "infinity");

  await execDocker(args);
  await execDocker(["start", name]);

  if (cfg.setupCommand?.trim()) {
    await execDocker(["exec", "-i", name, "sh", "-lc", cfg.setupCommand]);
  }
}

async function ensureSandboxContainer(params: {
  sessionKey: string;
  workspaceDir: string;
  cfg: SandboxConfig;
}) {
  const slug = params.cfg.perSession
    ? slugifySessionKey(params.sessionKey)
    : "shared";
  const name = `${params.cfg.docker.containerPrefix}${slug}`;
  const containerName = name.slice(0, 63);
  const state = await dockerContainerState(containerName);
  if (!state.exists) {
    await createSandboxContainer({
      name: containerName,
      cfg: params.cfg.docker,
      workspaceDir: params.workspaceDir,
      sessionKey: params.sessionKey,
    });
  } else if (!state.running) {
    await execDocker(["start", containerName]);
  }
  const now = Date.now();
  await updateRegistry({
    containerName,
    sessionKey: params.sessionKey,
    createdAtMs: now,
    lastUsedAtMs: now,
    image: params.cfg.docker.image,
  });
  return containerName;
}

async function pruneSandboxContainers(cfg: SandboxConfig) {
  const now = Date.now();
  const idleHours = cfg.prune.idleHours;
  const maxAgeDays = cfg.prune.maxAgeDays;
  if (idleHours === 0 && maxAgeDays === 0) return;
  const registry = await readRegistry();
  for (const entry of registry.entries) {
    const idleMs = now - entry.lastUsedAtMs;
    const ageMs = now - entry.createdAtMs;
    if (
      (idleHours > 0 && idleMs > idleHours * 60 * 60 * 1000) ||
      (maxAgeDays > 0 && ageMs > maxAgeDays * 24 * 60 * 60 * 1000)
    ) {
      try {
        await execDocker(["rm", "-f", entry.containerName], {
          allowFailure: true,
        });
      } catch {
        // ignore prune failures
      } finally {
        await removeRegistryEntry(entry.containerName);
      }
    }
  }
}

async function maybePruneSandboxes(cfg: SandboxConfig) {
  const now = Date.now();
  if (now - lastPruneAtMs < 5 * 60 * 1000) return;
  lastPruneAtMs = now;
  try {
    await pruneSandboxContainers(cfg);
  } catch (error) {
    const message =
      error instanceof Error
        ? error.message
        : typeof error === "string"
          ? error
          : JSON.stringify(error);
    defaultRuntime.error?.(
      `Sandbox prune failed: ${message ?? "unknown error"}`,
    );
  }
}

export async function resolveSandboxContext(params: {
  config?: ClawdisConfig;
  sessionKey?: string;
  workspaceDir?: string;
}): Promise<SandboxContext | null> {
  const rawSessionKey = params.sessionKey?.trim();
  if (!rawSessionKey) return null;
  const cfg = defaultSandboxConfig(params.config);
  const mainKey = params.config?.session?.mainKey?.trim() || "main";
  if (!shouldSandboxSession(cfg, rawSessionKey, mainKey)) return null;

  await maybePruneSandboxes(cfg);

  const workspaceRoot = resolveUserPath(cfg.workspaceRoot);
  const workspaceDir = cfg.perSession
    ? resolveSandboxWorkspaceDir(workspaceRoot, rawSessionKey)
    : workspaceRoot;
  const seedWorkspace =
    params.workspaceDir?.trim() || DEFAULT_AGENT_WORKSPACE_DIR;
  await ensureSandboxWorkspace(workspaceDir, seedWorkspace);

  const containerName = await ensureSandboxContainer({
    sessionKey: rawSessionKey,
    workspaceDir,
    cfg,
  });

  return {
    enabled: true,
    sessionKey: rawSessionKey,
    workspaceDir,
    containerName,
    containerWorkdir: cfg.docker.workdir,
    docker: cfg.docker,
    tools: cfg.tools,
  };
}
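
A hedged end-to-end sketch (not in the commit; the `config` variable and the coding-tools import path are assumptions): the gateway resolves a sandbox context per session key and hands it to the tool factory, which is where the per-session container and workspace get wired in.

```ts
import { resolveSandboxContext } from "./agents/sandbox.js";
import { createClawdisCodingTools } from "./agents/coding-tools.js"; // module path assumed

const sandbox = await resolveSandboxContext({
  config,                        // loaded ClawdisConfig with agent.sandbox.mode: "non-main"
  sessionKey: "discord:guild-1", // any non-main session key
});
// sandbox?.containerName is `${containerPrefix}${slug}`, roughly "clawdis-sbx-discord-guild-1-<8 hex chars>"
// sandbox?.workspaceDir lives under agent.sandbox.workspaceRoot

const tools = createClawdisCodingTools({ sandbox });
```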
@@ -33,7 +33,14 @@ type GatewayRunSignalAction = "stop" | "restart";
 
 function parsePort(raw: unknown): number | null {
   if (raw === undefined || raw === null) return null;
-  const parsed = Number.parseInt(String(raw), 10);
+  const value =
+    typeof raw === "string"
+      ? raw
+      : typeof raw === "number" || typeof raw === "bigint"
+        ? raw.toString()
+        : null;
+  if (value === null) return null;
+  const parsed = Number.parseInt(value, 10);
   if (!Number.isFinite(parsed) || parsed <= 0) return null;
   return parsed;
 }
@@ -634,6 +634,50 @@ export type ClawdisConfig = {
     /** How long to keep finished sessions in memory (ms). */
     cleanupMs?: number;
   };
+  /** Optional sandbox settings for non-main sessions. */
+  sandbox?: {
+    /** Enable sandboxing for sessions. */
+    mode?: "off" | "non-main" | "all";
+    /** Use one container per session (recommended for hard isolation). */
+    perSession?: boolean;
+    /** Root directory for sandbox workspaces. */
+    workspaceRoot?: string;
+    /** Docker-specific sandbox settings. */
+    docker?: {
+      /** Docker image to use for sandbox containers. */
+      image?: string;
+      /** Prefix for sandbox container names. */
+      containerPrefix?: string;
+      /** Container workdir mount path (default: /workspace). */
+      workdir?: string;
+      /** Run container rootfs read-only. */
+      readOnlyRoot?: boolean;
+      /** Extra tmpfs mounts for read-only containers. */
+      tmpfs?: string[];
+      /** Container network mode (bridge|none|custom). */
+      network?: string;
+      /** Container user (uid:gid). */
+      user?: string;
+      /** Drop Linux capabilities. */
+      capDrop?: string[];
+      /** Extra environment variables for sandbox exec. */
+      env?: Record<string, string>;
+      /** Optional setup command run once after container creation. */
+      setupCommand?: string;
+    };
+    /** Tool allow/deny policy (deny wins). */
+    tools?: {
+      allow?: string[];
+      deny?: string[];
+    };
+    /** Auto-prune sandbox containers. */
+    prune?: {
+      /** Prune if idle for more than N hours (0 disables). */
+      idleHours?: number;
+      /** Prune if older than N days (0 disables). */
+      maxAgeDays?: number;
+    };
+  };
   };
   routing?: RoutingConfig;
   messages?: MessagesConfig;
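
For reference, a minimal object satisfying the new type (illustrative values; every field shown is optional and falls back to the documented defaults):

```ts
// Sketch only — assumes ClawdisConfig exposes the `agent` property referenced elsewhere in this commit.
const sandboxConfig: NonNullable<ClawdisConfig["agent"]>["sandbox"] = {
  mode: "non-main",
  perSession: true,
  docker: { image: "clawdis-sandbox:bookworm-slim", network: "none" },
  tools: { allow: ["bash", "read"], deny: ["browser"] },
  prune: { idleHours: 12, maxAgeDays: 3 },
};
```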
@@ -1041,6 +1085,41 @@ export const ClawdisSchema = z.object({
         cleanupMs: z.number().int().positive().optional(),
       })
       .optional(),
+    sandbox: z
+      .object({
+        mode: z
+          .union([z.literal("off"), z.literal("non-main"), z.literal("all")])
+          .optional(),
+        perSession: z.boolean().optional(),
+        workspaceRoot: z.string().optional(),
+        docker: z
+          .object({
+            image: z.string().optional(),
+            containerPrefix: z.string().optional(),
+            workdir: z.string().optional(),
+            readOnlyRoot: z.boolean().optional(),
+            tmpfs: z.array(z.string()).optional(),
+            network: z.string().optional(),
+            user: z.string().optional(),
+            capDrop: z.array(z.string()).optional(),
+            env: z.record(z.string(), z.string()).optional(),
+            setupCommand: z.string().optional(),
+          })
+          .optional(),
+        tools: z
+          .object({
+            allow: z.array(z.string()).optional(),
+            deny: z.array(z.string()).optional(),
+          })
+          .optional(),
+        prune: z
+          .object({
+            idleHours: z.number().int().nonnegative().optional(),
+            maxAgeDays: z.number().int().nonnegative().optional(),
+          })
+          .optional(),
+      })
+      .optional(),
     })
     .optional(),
   routing: RoutingSchema,
@@ -129,13 +129,16 @@ function buildBaseHints(): ConfigUiHints {
   };
   }
   for (const [path, label] of Object.entries(FIELD_LABELS)) {
-    hints[path] = { ...(hints[path] ?? {}), label };
+    const current = hints[path];
+    hints[path] = current ? { ...current, label } : { label };
   }
   for (const [path, help] of Object.entries(FIELD_HELP)) {
-    hints[path] = { ...(hints[path] ?? {}), help };
+    const current = hints[path];
+    hints[path] = current ? { ...current, help } : { help };
   }
   for (const [path, placeholder] of Object.entries(FIELD_PLACEHOLDERS)) {
-    hints[path] = { ...(hints[path] ?? {}), placeholder };
+    const current = hints[path];
+    hints[path] = current ? { ...current, placeholder } : { placeholder };
   }
   return hints;
 }
@@ -210,7 +210,7 @@ export function createAgentEventHandler({
 
     const jobState =
       evt.stream === "job" && typeof evt.data?.state === "string"
-        ? (evt.data.state as "done" | "error" | string)
+        ? evt.data.state
         : null;
 
     if (sessionKey) {
@@ -10,6 +10,19 @@ import {
   testState,
 } from "./test-helpers.js";
 
+const decodeWsData = (data: unknown): string => {
+  if (typeof data === "string") return data;
+  if (Buffer.isBuffer(data)) return data.toString("utf-8");
+  if (Array.isArray(data)) return Buffer.concat(data).toString("utf-8");
+  if (data instanceof ArrayBuffer) return Buffer.from(data).toString("utf-8");
+  if (ArrayBuffer.isView(data)) {
+    return Buffer.from(data.buffer, data.byteOffset, data.byteLength).toString(
+      "utf-8",
+    );
+  }
+  return "";
+};
+
 installGatewayTestHooks();
 
 describe("gateway server cron", () => {
@@ -253,7 +266,7 @@ describe("gateway server cron", () => {
     }>((resolve) => {
       const timeout = setTimeout(() => resolve(null as never), 8000);
       ws.on("message", (data) => {
-        const obj = JSON.parse(String(data));
+        const obj = JSON.parse(decodeWsData(data));
         if (
           obj.type === "event" &&
           obj.event === "cron" &&
@@ -20,6 +20,19 @@ import {
   testState,
 } from "./test-helpers.js";
 
+const decodeWsData = (data: unknown): string => {
+  if (typeof data === "string") return data;
+  if (Buffer.isBuffer(data)) return data.toString("utf-8");
+  if (Array.isArray(data)) return Buffer.concat(data).toString("utf-8");
+  if (data instanceof ArrayBuffer) return Buffer.from(data).toString("utf-8");
+  if (ArrayBuffer.isView(data)) {
+    return Buffer.from(data.buffer, data.byteOffset, data.byteLength).toString(
+      "utf-8",
+    );
+  }
+  return "";
+};
+
 installGatewayTestHooks();
 
 describe("gateway server node/bridge", () => {
@@ -37,7 +50,7 @@ describe("gateway server node/bridge", () => {
       payload?: unknown;
     }>((resolve) => {
       ws.on("message", (data) => {
-        const obj = JSON.parse(String(data)) as {
+        const obj = JSON.parse(decodeWsData(data)) as {
           type?: string;
           event?: string;
           payload?: unknown;
@@ -83,7 +96,7 @@ describe("gateway server node/bridge", () => {
       payload?: unknown;
     }>((resolve) => {
       ws.on("message", (data) => {
-        const obj = JSON.parse(String(data)) as {
+        const obj = JSON.parse(decodeWsData(data)) as {
           type?: string;
           event?: string;
           payload?: unknown;
@@ -805,7 +818,7 @@ describe("gateway server node/bridge", () => {
       payload?: unknown;
     }>((resolve) => {
       ws.on("message", (data) => {
-        const obj = JSON.parse(String(data));
+        const obj = JSON.parse(decodeWsData(data));
         if (isVoiceFinalChatEvent(obj)) {
           resolve(obj as never);
         }
@@ -6,6 +6,8 @@ import {
   startServerWithClient,
 } from "./test-helpers.js";
 
+const loadConfigHelpers = async () => await import("../config/config.js");
+
 installGatewayTestHooks();
 
 describe("gateway server providers", () => {
@@ -63,9 +65,8 @@ describe("gateway server providers", () => {
   test("telegram.logout clears bot token from config", async () => {
     const prevToken = process.env.TELEGRAM_BOT_TOKEN;
     delete process.env.TELEGRAM_BOT_TOKEN;
-    const { readConfigFileSnapshot, writeConfigFile } = await import(
-      "../config/config.js"
-    );
+    const { readConfigFileSnapshot, writeConfigFile } =
+      await loadConfigHelpers();
     await writeConfigFile({
       telegram: {
         botToken: "123:abc",
|||||||
@@ -322,6 +322,7 @@ export function installGatewayTestHooks() {
|
|||||||
testState.allowFrom = undefined;
|
testState.allowFrom = undefined;
|
||||||
testIsNixMode.value = false;
|
testIsNixMode.value = false;
|
||||||
cronIsolatedRun.mockClear();
|
cronIsolatedRun.mockClear();
|
||||||
|
agentCommand.mockClear();
|
||||||
drainSystemEvents();
|
drainSystemEvents();
|
||||||
resetAgentRunContextForTest();
|
resetAgentRunContextForTest();
|
||||||
const mod = await import("./server.js");
|
const mod = await import("./server.js");
|
||||||
|
|||||||
@@ -132,7 +132,16 @@ class WizardSessionPrompter implements WizardPrompter {
         placeholder: params.placeholder,
         executor: "client",
       });
-      const value = String(res ?? "");
+      const value =
+        res === null || res === undefined
+          ? ""
+          : typeof res === "string"
+            ? res
+            : typeof res === "number" ||
+                typeof res === "boolean" ||
+                typeof res === "bigint"
+              ? String(res)
+              : "";
       const error = params.validate?.(value);
       if (error) {
         throw new Error(error);