Implements AI answers to questions in v2ex topics. You can use DeepSeek or any other LLM compatible with the OpenAI API. Responses are cached locally to avoid heavy token consumption. Adapted from the v2ex AI topic-summary script at https://github.com/dlzmoe/scripts by https://greasyfork.org/zh-CN/users/741071-falconchen
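
The caching idea can be sketched roughly as below. This is a minimal sketch, assuming a Tampermonkey-style userscript that keys the cache by topic ID and calls an OpenAI-compatible chat completions endpoint; the constant names (`API_BASE`, `API_KEY`, `MODEL`) and the `getAnswer` helper are hypothetical and not taken from the actual script.

```javascript
// ==UserScript==
// @name         v2ex AI answer (cache sketch)
// @match        https://www.v2ex.com/t/*
// @grant        GM_xmlhttpRequest
// @grant        GM_getValue
// @grant        GM_setValue
// ==/UserScript==

// Hypothetical settings; the real script would read these from its own config.
const API_BASE = 'https://api.deepseek.com/v1'; // any OpenAI-compatible base URL
const API_KEY  = 'sk-...';                      // placeholder, supplied by the user
const MODEL    = 'deepseek-chat';

// Cache key derived from the topic ID so repeat visits reuse the stored answer.
function cacheKey(topicId) {
  return `ai_answer_${topicId}`;
}

// Return the cached answer if present; otherwise call the LLM and store the result.
function getAnswer(topicId, question, onDone) {
  const cached = GM_getValue(cacheKey(topicId), null);
  if (cached) {
    onDone(cached); // no API call, so no tokens are spent
    return;
  }
  GM_xmlhttpRequest({
    method: 'POST',
    url: `${API_BASE}/chat/completions`,
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${API_KEY}`,
    },
    data: JSON.stringify({
      model: MODEL,
      messages: [{ role: 'user', content: question }],
    }),
    onload: (res) => {
      const answer = JSON.parse(res.responseText).choices[0].message.content;
      GM_setValue(cacheKey(topicId), answer); // persist locally for the next visit
      onDone(answer);
    },
  });
}
```

Keying the cache on the topic ID means that revisiting the same thread returns the stored answer immediately instead of issuing another API request.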
Date | Installs | Total update checks |
---|---|---|
2025-09-17 | 0 | 0 |
2025-09-18 | 0 | 0 |
2025-09-19 | 0 | 0 |
2025-09-20 | 0 | 0 |
2025-09-21 | 0 | 0 |
2025-09-22 | 0 | 0 |
2025-09-23 | 0 | 1 |
2025-09-24 | 0 | 0 |
2025-09-25 | 0 | 0 |
2025-09-26 | 0 | 0 |
2025-09-27 | 0 | 0 |
2025-09-28 | 0 | 1 |
2025-09-29 | 0 | 0 |
2025-09-30 | 0 | 0 |
2025-10-01 | 0 | 0 |
2025-10-02 | 0 | 0 |
2025-10-03 | 0 | 0 |
2025-10-04 | 0 | 0 |
2025-10-05 | 0 | 1 |
2025-10-06 | 0 | 0 |
2025-10-07 | 0 | 0 |
2025-10-08 | 0 | 0 |
2025-10-09 | 0 | 0 |
2025-10-10 | 0 | 0 |
2025-10-11 | 0 | 2 |
2025-10-12 | 0 | 0 |
2025-10-13 | 0 | 0 |
2025-10-14 | 0 | 0 |
2025-10-15 | 0 | 0 |
2025-10-16 | 0 | 0 |