docs: add custom memory endpoint example
@@ -149,3 +149,28 @@ Local mode:
- When `memorySearch.provider = "local"`, `node-llama-cpp` resolves `modelPath`; if the GGUF is missing it **auto-downloads** to the cache (or `local.modelCacheDir` if set), then loads it. Downloads resume on retry.
- Native build requirement: run `pnpm approve-builds`, pick `node-llama-cpp`, then `pnpm rebuild node-llama-cpp`.
- Fallback: if local setup fails and `memorySearch.fallback = "openai"`, we automatically switch to remote embeddings (`openai/text-embedding-3-small` unless overridden) and record the reason.

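The local-with-fallback behavior above can be sketched as a config fragment (a minimal sketch; the `modelCacheDir` path is an illustrative placeholder, not a default):

```json5
agents: {
  defaults: {
    memorySearch: {
      provider: "local",
      // If local setup fails, switch to remote embeddings and record the reason.
      fallback: "openai",
      local: {
        // Optional: override where missing GGUF models are auto-downloaded.
        // This path is a hypothetical example.
        modelCacheDir: "~/.cache/my-app/models"
      }
    }
  }
}
```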
### Custom OpenAI-compatible endpoint example

```json5
agents: {
  defaults: {
    memorySearch: {
      provider: "openai",
      model: "text-embedding-3-small",
      remote: {
        baseUrl: "https://api.example.com/v1/",
        apiKey: "YOUR_REMOTE_API_KEY",
        headers: {
          "X-Organization": "org-id",
          "X-Project": "project-id"
        }
      }
    }
  }
}
```

Notes:

- `remote.*` takes precedence over `models.providers.openai.*`.
- `remote.headers` are merged with the OpenAI provider headers; on key conflicts the remote value wins. Omit `remote.headers` to use the OpenAI defaults.
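The header-merge rule can be sketched as follows (a hypothetical helper, not part of the library's API; it only illustrates the "remote wins on key conflicts" semantics described in the notes):

```typescript
// Sketch of the merge rule: start from the OpenAI provider headers,
// then overlay the remote headers so the remote value wins on conflicts.
function mergeHeaders(
  openaiHeaders: Record<string, string>,
  remoteHeaders?: Record<string, string>,
): Record<string, string> {
  // Spread order matters: later spreads overwrite earlier keys.
  return { ...openaiHeaders, ...(remoteHeaders ?? {}) };
}

const merged = mergeHeaders(
  { Authorization: "Bearer YOUR_KEY", "X-Project": "default" },
  { "X-Organization": "org-id", "X-Project": "project-id" },
);
// merged["X-Project"] is "project-id": the remote header overrides
// the OpenAI default, while Authorization passes through unchanged.
```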