Feature: Unlock AI model configuration (FAST_MODEL & SMART_MODEL) #23
Open
Vanilla-Yukirin wants to merge 2 commits into wendy7756:main from
Conversation
The default models (gpt-3.5-turbo & gpt-4o) remain unchanged; existing functionality is unaffected.
Update README.md and README_ZH.md to reflect the new environment variables for AI model selection:
- Add FAST_MODEL and SMART_MODEL to the "Environment Variables" table
- Include them in the "Quick Start" configuration example
- Add a new entry under "Features" about configurable AI models
- Add an FAQ entry explaining how to use these variables
Hi @wendy7756,
While using the project, I found that the AI models (gpt-3.5-turbo and gpt-4o) in summarizer.py are hard-coded, which is inconvenient for users who want to use custom models. To fix this, I made a very small change that adds a great deal of flexibility.
Core changes
- Split the model choice into two environment variables:
  - FAST_MODEL: used for "mechanical" work such as formatting and fixing typos.
  - SMART_MODEL: used for "thinking" work such as summarizing, consolidating, and translating.
- Added default values in __init__.py:
  - FAST_MODEL defaults to gpt-3.5-turbo
  - SMART_MODEL defaults to gpt-4o

Some advantages
100% backward compatible:
Existing users who do not set these two new variables get exactly the same behavior as today, with exactly the same models. The change has zero impact on current users and is completely safe.
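The default-fallback pattern described above could look roughly like this (a sketch; the actual variable names and layout in the PR's `__init__.py` may differ):

```python
import os

# If the user sets nothing, behavior is identical to the previously
# hard-coded models, which is what makes the change backward compatible.
FAST_MODEL = os.getenv("FAST_MODEL", "gpt-3.5-turbo")
SMART_MODEL = os.getenv("SMART_MODEL", "gpt-4o")
```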
Great flexibility (the powerful part):
Advanced users can now swap in any model they like, for example:

- FAST_MODEL=GLM-4.5-Flash
- SMART_MODEL=GLM-4.5-Air
- llama, claude, or any other model compatible with the OpenAI API.

This upgrades the project from "OpenAI only" to working with all mainstream AI models, and lets users strike their own balance between cost and quality.
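Because the model name is just a string passed through an OpenAI-compatible API, any backend that speaks that API works. A hypothetical sketch of how a summarizer call might pick up the configured model (the real function signatures in summarizer.py may differ):

```python
import os

# "Brain work" (summarizing) uses SMART_MODEL; falls back to gpt-4o.
SMART_MODEL = os.getenv("SMART_MODEL", "gpt-4o")

def summarize(client, text: str) -> str:
    # `client` is any OpenAI-compatible client object; the configured
    # model name is simply forwarded in the request.
    resp = client.chat.completions.create(
        model=SMART_MODEL,
        messages=[{"role": "user", "content": f"Summarize:\n{text}"}],
    )
    return resp.choices[0].message.content
```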
I also updated README.md and README_ZH.md accordingly, documenting the two new variables in the features section, the configuration table, and the FAQ. This is a tiny change (only a few lines of code) that brings a lot of value, and I strongly recommend merging this PR.
Thanks for your time and review!