fix: enable lmstudio responses and drop think tags

Peter Steinberger
2025-12-27 00:28:52 +00:00
parent 2477ffd860
commit 7e380bb6f8
6 changed files with 29 additions and 10 deletions


@@ -39,6 +39,7 @@
 - Streamed `<think>` segments are stripped before partial replies are emitted.
 - System prompt now tags allowlisted owner numbers as the user identity to avoid mistaken “friend” assumptions.
 - LM Studio/Ollama replies now require `<final>` tags; streaming ignores content until `<final>` begins.
+- LM Studio responses API: tools payloads no longer include `strict: null`, and LM Studio no longer gets forced `<think>/<final>` tags.
 - `process log` pagination is now line-based (omit `offset` to grab the last N lines).
 - macOS WebChat: assistant bubbles now update correctly when toggling light/dark mode.
 - macOS: avoid spawning a duplicate gateway process when an external listener already exists.
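
Stripping `<think>` segments from streamed replies requires state, since a think block can span several chunks. A minimal sketch of such a filter (a hypothetical helper, not this project's actual implementation; it assumes the tag markers themselves are never split across chunk boundaries):

```python
class ThinkFilter:
    """Drops text between <think> and </think> across streamed chunks."""

    OPEN, CLOSE = "<think>", "</think>"

    def __init__(self) -> None:
        self.in_think = False  # are we currently inside a think block?
        self.buf = ""          # unprocessed tail of the stream

    def feed(self, chunk: str) -> str:
        """Consume one streamed chunk; return only the emit-safe text."""
        self.buf += chunk
        out = []
        while self.buf:
            if self.in_think:
                end = self.buf.find(self.CLOSE)
                if end == -1:
                    self.buf = ""  # still inside the block: discard it all
                    break
                self.buf = self.buf[end + len(self.CLOSE):]
                self.in_think = False
            else:
                start = self.buf.find(self.OPEN)
                if start == -1:
                    out.append(self.buf)  # no tag ahead: emit everything
                    self.buf = ""
                    break
                out.append(self.buf[:start])
                self.buf = self.buf[start + len(self.OPEN):]
                self.in_think = True
        return "".join(out)
```

A production version would also hold back a buffer tail that could be the prefix of a split tag; this sketch omits that for brevity.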