I run into this often too; there are several ways to reproduce it:
1. Keep the logged-in window, close the other window, then re-enable the one you closed.
2. Add a custom model and edit the llm-models metadata file; the error then appears.
3. At random: it can also show up after changing some parameters.
According to the docs, the provider.json configuration says: "model: sets the name of the model to use. This model must exist in the llm-models.json metadata." So you need to add the information for the model you want to use to data/metadata/llm-models.json.
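For reference, the relevant part of provider.json might look roughly like the excerpt below. The surrounding keys are omitted and the exact layout can differ between LangBot versions, so treat this only as a sketch of where the model name goes:

```json
{
    "model": "deepseek-r1-distill-llama-70b"
}
```

Whatever string is set as model here has to match an entry in data/metadata/llm-models.json, otherwise the PreProcessor stage fails with the error above.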
Just add the model shown in the screenshot to LangBot/data/metadata/llm-models.json, adjust it for your own model, and then restart the container.
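Since the screenshot isn't reproduced here, the following is a rough sketch of what an entry appended to the model list in llm-models.json might look like. The field names and values (name, requester, token_mgr, tool_call_supported, vision_supported) are assumptions based on typical entries and depend on your provider and LangBot version, so the safest approach is to copy an existing entry in your own file and change only the model name:

```json
{
    "name": "deepseek-r1-distill-llama-70b",
    "requester": "openai-chat-completions",
    "token_mgr": "openai",
    "tool_call_supported": false,
    "vision_supported": false
}
```

After saving the file, restart the container (or the LangBot process) so the metadata is reloaded and the model can be resolved.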
Message platform adapter
Other (or not yet in use)
Runtime environment
Windows 10
Exception details
[01-30 15:49:41.914] controller.py (189) - [ERROR] : Error while processing request query_id=0 stage=PreProcessor : cannot determine information for model deepseek-r1-distill-llama-70b, please configure it in the metadata
Steps to reproduce
Enabled plugins