[Feature]: Add API support for DeepSeek V3 #2105
Comments
DeepSeek is already supported. Which version are you using?
I'm currently on version 3.91.
How do you get it working? There is no bridge_deepseek.py file, and bridge_deepseekcoder.py is for the local model. Is adding it enough to make it work?
In config_private.py, add deepseek-chat to AVAIL_LLM_MODELS. You can also make it the default model with LLM_MODEL = "deepseek-chat" (or select it later in the UI). Then set your DeepSeek API key in DEEPSEEK_API_KEY; the default request URL is "https://api.deepseek.com/v1/chat/completions". See the sketch below.
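A minimal sketch of the relevant config_private.py fragment, based on the option names mentioned above (AVAIL_LLM_MODELS, LLM_MODEL, DEEPSEEK_API_KEY); the other entries in the model list and the placeholder key value are assumptions.

```python
# Make deepseek-chat selectable in the model dropdown
# (the other list entries here are just examples)
AVAIL_LLM_MODELS = ["gpt-3.5-turbo", "deepseek-chat"]

# Optionally make it the default model (it can also be switched later in the UI)
LLM_MODEL = "deepseek-chat"

# DeepSeek API key; the default request URL is
# "https://api.deepseek.com/v1/chat/completions"
DEEPSEEK_API_KEY = "sk-..."  # replace with your own key
```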
I see there is now a deepseek-R1. In the model pre-selection I entered deepseek-V1 and deepseek-V3, but neither connects. deepseek-chat does connect, but I don't know which model it actually maps to.
The DeepSeek API only provides three models: deepseek-chat / deepseek-coder / deepseek-reasoner. Calling deepseek-chat actually uses the V3 model. The "v1" in the API endpoint is there for compatibility with the OpenAI BaseUrl. (See: https://api-docs.deepseek.com/zh-cn/ )
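A small standalone sketch (independent of this project) illustrating the point above: the endpoint keeps "v1" only for OpenAI compatibility, and the model name "deepseek-chat" is what selects V3. The key value is a placeholder, and mapping deepseek-reasoner to R1 follows the DeepSeek docs linked above.

```python
import requests

# OpenAI-compatible chat completions request against the DeepSeek endpoint
resp = requests.post(
    "https://api.deepseek.com/v1/chat/completions",
    headers={"Authorization": "Bearer sk-..."},  # your DEEPSEEK_API_KEY
    json={
        "model": "deepseek-chat",  # V3; "deepseek-reasoner" would be R1
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```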
Class | Type
Large Language Model
Feature Request
Add API support for DeepSeek V3