Commit Graph

29 Commits

Author SHA1 Message Date
1e0n
5ddbbbb47c update version 2025-10-13 04:33:20 +08:00
1e0n
732925125f Remove sensitive words that trigger 403 errors 2025-10-13 04:32:45 +08:00
1e0n
666e58d681 Upgrade to v1.3.2: support FACTORY_API_KEY environment variable configuration for Docker deployments 2025-10-09 22:34:31 +08:00
1e0n
e0129b7a83 Fix Anthropic endpoint authorization forwarding: consistently use the authorization header when forwarding to the upstream endpoint 2025-10-09 15:21:42 +08:00
1e0n
27fdb7e157 Fix x-api-key authentication logic for the /v1/messages endpoint:
- handleDirectMessages reads the client's x-api-key header
- getAnthropicHeaders prefers the client's x-api-key, avoiding setting both x-api-key and authorization at the same time
2025-10-09 15:05:30 +08:00
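A minimal sketch of the header-selection rule described in 27fdb7e157. getAnthropicHeaders is a real function name from the commit, but the signature, parameters, and default anthropic-version shown here are assumptions; note that the later commit e0129b7a83 switches upstream forwarding to the authorization header.

```js
// Hypothetical sketch: if the client sent x-api-key, forward only that;
// otherwise fall back to a proxy-side bearer token, never setting both
// x-api-key and authorization on the same upstream request.
function getAnthropicHeaders(clientApiKey, fallbackToken, anthropicVersion) {
  const headers = {
    'content-type': 'application/json',
    // Pass through the client's anthropic-version header, with an assumed default.
    'anthropic-version': anthropicVersion || '2023-06-01',
  };
  if (clientApiKey) {
    headers['x-api-key'] = clientApiKey;                    // client-supplied key wins
  } else if (fallbackToken) {
    headers['authorization'] = `Bearer ${fallbackToken}`;   // proxy-side token as fallback
  }
  return headers;
}
```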
1e0n
4503604d04 Support client x-api-key on Anthropic endpoints:
- /v1/messages reads x-api-key and forwards it as the preferred client authorization
- getAnthropicHeaders forwards x-api-key and passes through anthropic-version
- CORS allows X-API-Key and anthropic-version
2025-10-09 14:58:56 +08:00
1e0n
bcdd524a34 Upgrade to v1.3.1: convert non-streaming /v1/chat/completions response bodies from OpenAI endpoints into the OpenAI-compatible format; keep non-streaming Anthropic/Common responses passed through unchanged 2025-10-09 14:41:00 +08:00
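The commit does not describe the upstream payload shape, so the following is only an illustrative sketch of assembling the OpenAI-compatible non-streaming response that /v1/chat/completions returns after bcdd524a34; the function name and the input_tokens/output_tokens usage fields are assumptions.

```js
// Illustrative only: wrap extracted assistant text and usage counts into the
// standard OpenAI chat.completion response shape.
function toOpenAIChatCompletion(model, text, usage = {}) {
  return {
    id: `chatcmpl-${Date.now()}`,
    object: 'chat.completion',
    created: Math.floor(Date.now() / 1000),
    model,
    choices: [{
      index: 0,
      message: { role: 'assistant', content: text },
      finish_reason: 'stop',
    }],
    usage: {
      prompt_tokens: usage.input_tokens ?? 0,
      completion_tokens: usage.output_tokens ?? 0,
      total_tokens: (usage.input_tokens ?? 0) + (usage.output_tokens ?? 0),
    },
  };
}
```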
1e0n
4a8d7986dd Upgrade to v1.3.0: add auto reasoning mode and improve reasoning-level documentation
Main feature updates:
- Add the auto reasoning level, which fully respects the client's original request parameters
- Support five reasoning levels: auto/off/low/medium/high
- auto mode is zero-intervention: reasoning fields and the anthropic-beta header are left untouched
- All models except gpt-5-codex default to auto mode

Documentation improvements:
- Update the core feature description to highlight smart reasoning-level control
- Add a detailed explanation of auto reasoning mode and its use cases
- Add a reasoning-level comparison table and configuration examples
- Expand the FAQ section with scenario-based answers to reasoning questions
- Document the corresponding OpenAI and Anthropic model fields

Technical implementation:
- Update the getModelReasoning function to support the auto option
- Improve auto-mode handling in all transformers
- Optimize auto support for the direct-forwarding endpoints in routes.js
- Ensure headers and request bodies are passed through untouched in auto mode
2025-10-09 13:32:50 +08:00
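A sketch of the five-level behaviour 4a8d7986dd describes on the OpenAI side. getModelReasoning is named in the commit, but the config layout, the per-model default, and the use of a reasoning.effort field are assumptions.

```js
// Hypothetical config lookup: per-model reasoning level, defaulting to 'auto'
// for everything except gpt-5-codex (which the commit sets separately).
function getModelReasoning(config, model) {
  return config.models?.[model]?.reasoning ?? 'auto';
}

// Apply the level to an OpenAI-style request body before forwarding.
function applyReasoning(body, level) {
  if (level === 'auto') return body;       // zero intervention: body passed through untouched
  const out = { ...body };
  if (level === 'off') {
    delete out.reasoning;                  // strip reasoning fields entirely
  } else {
    out.reasoning = { effort: level };     // low / medium / high
  }
  return out;
}
```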
1e0n
036198cebb Upgrade to v1.2.2: improve streaming documentation
- Bump version to 1.2.2
- Update the README to highlight smart streaming handling
- Add usage examples for streaming and non-streaming responses
- Explain the three ways to set the stream parameter in detail
- Add an FAQ entry on controlling streaming responses
- Emphasize that the client's stream parameter setting is fully respected
2025-10-09 12:01:49 +08:00
1e0n
1b1a25e68d Fix stream parameter handling: respect the stream parameter explicitly specified by the client
- Fix the incorrect transformer logic that forced stream=true
- Forward the stream parameter only when the client explicitly specifies it
- Do not add stream when the client omits it, preserving the original intent
- Update the corresponding streaming checks in routes.js
- Ensure non-streaming requests are handled correctly
2025-10-09 11:50:49 +08:00
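A minimal sketch of the rule fixed in 1b1a25e68d; the helper name and transformer shape are hypothetical, not the repo's actual function.

```js
// Only forward `stream` when the client explicitly set it; never force
// stream=true the way the old transformer logic did.
function buildUpstreamBody(clientBody, transformed) {
  const out = { ...transformed };
  if (Object.prototype.hasOwnProperty.call(clientBody, 'stream')) {
    out.stream = clientBody.stream;   // respect the client's explicit choice
  } else {
    delete out.stream;                // client omitted it: leave it out entirely
  }
  return out;
}
```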
1e0n
69fdb27b07 Upgrade to v1.2.1: adjust the GPT-5 reasoning-level configuration
- Bump version to 1.2.1
- Change the GPT-5 model's reasoning level from off to high
- Improve the reasoning-field deletion logic so the configuration takes effect
2025-10-08 22:16:05 +08:00
1e0n
74521c54c3 Fix reasoning-field deletion logic: correctly remove reasoning fields from the original request when reasoning is set to off
- Fix handling of the reasoning field in request-openai.js
- Fix handling of the thinking field in request-anthropic.js
- Ensure reasoning-related fields are explicitly deleted from the original request when the model's reasoning is configured as off
- Keep this consistent with the direct-forwarding logic in routes.js
2025-10-08 22:07:41 +08:00
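A hypothetical sketch of the request-anthropic.js side of 74521c54c3: when the configured level is off, the client's thinking block is removed rather than forwarded. The budget mapping shown is an assumption; the real values live in the project's configuration.

```js
// Strip or set the Anthropic `thinking` block according to the configured level.
function applyAnthropicThinking(body, level) {
  const out = { ...body };
  if (level === 'off') {
    delete out.thinking;   // explicit removal so no client thinking block leaks upstream
  } else {
    // low / medium / high: assumed token budgets for illustration only
    const budget = { low: 1024, medium: 4096, high: 16384 }[level] ?? 4096;
    out.thinking = { type: 'enabled', budget_tokens: budget };
  }
  return out;
}
```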
1e0n
21b852b59e Upgrade to v1.2.0: document the dual authorization mechanism
- Bump version to 1.2.0
- Update the README's core feature description to highlight the dual authorization mechanism
- Add a configuration guide for the three-level authorization priority
- Add usage scenarios for FACTORY_API_KEY
- Improve the authentication setup steps in the quick-start section
- Add FAQ entries about the authorization mechanism
2025-10-08 19:44:54 +08:00
1e0n
25f89a12b7 Implement the dual authorization system: the FACTORY_API_KEY environment variable takes priority, with client authorization as the fallback
- Add support for the FACTORY_API_KEY environment variable (highest priority)
- Keep the existing refresh-token auto-refresh mechanism
- Add the client's authorization header as a fallback
- Improve startup so the server no longer exits with an error when no authentication is configured
- Update all endpoints to support the new authorization priority system
- Change the GPT-5-Codex reasoning level to off
2025-10-08 19:42:39 +08:00
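A sketch of the three-level priority introduced in 25f89a12b7; the resolver name and parameters are hypothetical, only the ordering comes from the commit.

```js
// Resolve the token used for upstream requests:
// FACTORY_API_KEY env var > auto-refreshed token > client's Authorization header.
function resolveAuthToken(clientAuthHeader, refreshedToken) {
  if (process.env.FACTORY_API_KEY) return process.env.FACTORY_API_KEY;        // highest priority
  if (refreshedToken) return refreshedToken;                                  // refresh-token mechanism
  if (clientAuthHeader) return clientAuthHeader.replace(/^Bearer\s+/i, '');   // client fallback
  return null; // startup no longer exits here; individual requests fail instead
}
```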
1e0n
2dc8c89270 Unify User-Agent management: read the fixed value factory-cli/0.19.3 from config.json 2025-10-08 18:30:21 +08:00
1e0n
c677d7b429 Update documentation: add instructions for the Windows startup script 2025-10-08 15:58:30 +08:00
1e0n
e3e7a918cd Add Windows startup script start.bat 2025-10-08 15:58:02 +08:00
1e0n
3444cbdfdc Improve installation docs: clearly describe the npm install dependency step 2025-10-08 15:56:09 +08:00
1e0n
c5ec338fc4 Update documentation: focus on token refresh, reasoning levels, Docker deployment, and Claude Code integration 2025-10-08 15:42:22 +08:00
1e0n
1c29062ba7 Add per-model reasoning-level configuration 2025-10-08 05:26:31 +08:00
1e0n
191c53da40 Optimize prompts 2025-10-08 04:24:43 +08:00
1e0n
43803ca9da Add common endpoint support and system prompt injection, v1.1.0
- Add common endpoint type for GLM-4.6 model
- Implement automatic system prompt injection for all requests
- Simplify README documentation for better user focus
- Update version to 1.1.0
- Add *.txt to .gitignore

Co-authored-by: factory-droid[bot] <138933559+factory-droid[bot]@users.noreply.github.com>
2025-10-07 21:06:28 +08:00
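A rough sketch of the system prompt injection added in 43803ca9da for OpenAI-style request bodies. The prompt value, where it is stored, and the skip-if-already-present check are assumptions, not the repo's actual behaviour.

```js
// Placeholder prompt; the real text lives in the project configuration.
const SYSTEM_PROMPT = 'You are a helpful assistant.';

// Prepend the configured system prompt to an outgoing chat-style request.
function injectSystemPrompt(body) {
  const messages = Array.isArray(body.messages) ? [...body.messages] : [];
  const hasSystem = messages.some((m) => m.role === 'system');
  if (!hasSystem) {
    messages.unshift({ role: 'system', content: SYSTEM_PROMPT });
  }
  return { ...body, messages };
}
```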
1e0n
5fc2df4cd7 Update documentation for new endpoints
- Add overview of three endpoint modes
- Add /v1/responses endpoint documentation (OpenAI transparent proxy)
- Add /v1/messages endpoint documentation (Anthropic transparent proxy)
- Add endpoint comparison table
- Add usage guide for choosing appropriate endpoint
- Clarify format conversion only applies to /v1/chat/completions
- Add detailed examples for each endpoint
- Update feature list and usage instructions
2025-10-07 06:02:29 +08:00
1e0n
4d5ce26e7f Add /v1/messages endpoint for direct Anthropic forwarding
Features:
- Add new /v1/messages endpoint for transparent Anthropic request/response forwarding
- Only supports anthropic type endpoints (rejects openai with 400 error)
- No request transformation - forwards original request body as-is
- No response transformation - streams and non-streaming responses forwarded directly

Now supports three endpoint patterns:
- /v1/chat/completions: Universal with format conversion (anthropic, openai)
- /v1/responses: Direct proxy for openai endpoints only
- /v1/messages: Direct proxy for anthropic endpoints only
2025-10-07 05:26:57 +08:00
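A sketch of the direct-forwarding rule from 4d5ce26e7f. handleDirectMessages is a real name from a later commit, but the signature, the endpoint object shape, and the non-streaming-only body handling here are assumptions (the real route also pipes SSE streams through).

```js
// Reject non-anthropic endpoints with 400, otherwise proxy the request body
// upstream unchanged: no request or response transformation.
async function handleDirectMessages(endpoint, rawBody, headers, res) {
  if (endpoint.type !== 'anthropic') {
    res.statusCode = 400;
    res.end(JSON.stringify({ error: '/v1/messages only supports anthropic endpoints' }));
    return;
  }
  const upstream = await fetch(`${endpoint.baseUrl}/v1/messages`, {
    method: 'POST',
    headers,          // built elsewhere (see the getAnthropicHeaders sketch above)
    body: rawBody,    // forwarded as-is
  });
  res.statusCode = upstream.status;
  res.end(Buffer.from(await upstream.arrayBuffer())); // non-streaming case only
}
```

The /v1/responses endpoint in 79616ba3b9 follows the same pattern with the type check inverted to accept only openai endpoints.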
1e0n
79616ba3b9 Add /v1/responses endpoint for direct OpenAI forwarding
Features:
- Add new /v1/responses endpoint for transparent request/response forwarding
- Only supports openai type endpoints (rejects anthropic with 400 error)
- No request transformation - forwards original request body as-is
- No response transformation - streams and non-streaming responses forwarded directly
- /v1/chat/completions keeps original behavior with format conversion

Differences between endpoints:
- /v1/chat/completions: Converts formats for all endpoint types (anthropic, openai)
- /v1/responses: Direct proxy for openai endpoints only, zero transformation
2025-10-07 05:14:58 +08:00
1e0n
3aebe7e723 Optimized Log Display 2025-10-07 02:14:55 +08:00
1e0n
1bfbf5a31c Add detailed 404 error logging for invalid requests
- Log invalid request method, URL, path, and parameters
- Display query parameters and request body if present
- Show client IP and User-Agent information
- Return helpful error message with available endpoints
- Format console output with clear visual separators
2025-10-07 01:53:27 +08:00
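A hypothetical 404 handler in the spirit of 1bfbf5a31c, using only Node's built-in http request/response objects; the log format and the endpoint list in the response are illustrative.

```js
// Log method, path, query, body, client address, and User-Agent for an
// unmatched request, then answer with the available endpoints.
function handleNotFound(req, res, bodyText) {
  const url = new URL(req.url, 'http://localhost');
  console.log('========== 404 Not Found ==========');
  console.log(`${req.method} ${url.pathname}`);
  if (url.search) console.log(`query: ${url.search.slice(1)}`);
  if (bodyText) console.log(`body: ${bodyText}`);
  console.log(`client: ${req.socket.remoteAddress} UA: ${req.headers['user-agent'] ?? '-'}`);
  console.log('===================================');
  res.statusCode = 404;
  res.setHeader('content-type', 'application/json');
  res.end(JSON.stringify({
    error: 'Not found',
    available_endpoints: ['/v1/chat/completions', '/v1/responses', '/v1/messages'],
  }));
}
```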
1e0n
ad862f73d1 Add Docker deployment support
- Add Dockerfile for containerization
- Add docker-compose.yml for easy deployment
- Add .dockerignore to optimize build
- Add .env.example for environment configuration
- Add DOCKER_DEPLOY.md with comprehensive deployment guide

Support for:
- Local Docker deployment
- Cloud platforms (Render, Railway, Fly.io, GCP, AWS)
- Persistent storage configuration
- Health checks and monitoring
2025-10-06 02:17:37 +08:00
1e0n
6dca025e96 Initial commit: OpenAI compatible API proxy with auto token refresh
- Implemented OpenAI compatible API proxy server
- Support for Anthropic and custom OpenAI format conversion
- Automatic API key refresh with WorkOS OAuth
- SSE streaming response transformation
- Smart header management for Factory endpoints
- Chinese documentation
2025-10-06 02:12:01 +08:00
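A very rough sketch of the auto-refresh idea from the initial commit 6dca025e96: exchange the stored refresh token for a new access token when the current one nears expiry. The refresh URL, field names, and expiry window are placeholders, not the actual WorkOS OAuth endpoint details.

```js
const REFRESH_URL = 'https://example.com/oauth/token'; // placeholder, not the real endpoint

// Return a fresh auth record, refreshing only when the token expires within a minute.
async function refreshAccessToken(auth) {
  if (auth.expiresAt - Date.now() > 60_000) return auth; // still valid, keep it
  const resp = await fetch(REFRESH_URL, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ grant_type: 'refresh_token', refresh_token: auth.refreshToken }),
  });
  if (!resp.ok) throw new Error(`token refresh failed: ${resp.status}`);
  const data = await resp.json();
  return {
    accessToken: data.access_token,
    refreshToken: data.refresh_token ?? auth.refreshToken, // some providers rotate it
    expiresAt: Date.now() + (data.expires_in ?? 3600) * 1000,
  };
}
```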