
Output truncated for 01.AI (零一万物) and other large models #266

Open
ken0311 opened this issue Jun 10, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@ken0311

ken0311 commented Jun 10, 2024

Routine checks

  • I have confirmed that there is no similar existing issue
  • I have confirmed that I have upgraded to the latest version
  • I have read the project README in full, especially the FAQ section
  • I understand and am willing to follow up on this issue, helping with testing and providing feedback
  • I understand and accept the above, and I understand that the maintainers have limited time; issues that do not follow the rules may be ignored or closed directly

Problem description
When streaming long text with any of the 01.AI (零一万物) models, the output gets truncated. No `max_tokens` is set by default.

Steps to reproduce
Generate more than about 800 tokens with a 01.AI model; the output may be cut off.

Expected result
Add `max_tokens` to the request for certain models.

@ken0311 ken0311 added the bug Something isn't working label Jun 10, 2024
@ken0311
Author

ken0311 commented Jun 10, 2024

Additional details

Last returned chunk: data: {"id":"****","object":"chat.completion.chunk","created":*****,"model":"yi-large","choices":[{"delta":{"content":"。"},"index":0,"finish_reason":"length"}]. The output appears to have been truncated because of the length limit (`max_tokens` was not set); after adding it, the problem goes away.
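The truncation described above can be detected programmatically by checking `finish_reason` on each streamed chunk: `"length"` means the upstream model hit its token limit rather than finishing naturally. A minimal sketch, assuming OpenAI-compatible SSE lines (the `id`/`created` values below are placeholders, not from the actual response):

```python
import json

def finish_reason(sse_line: str):
    """Extract finish_reason from a single `data: {...}` SSE line, if any."""
    payload = sse_line.removeprefix("data: ").strip()
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    choices = chunk.get("choices", [])
    return choices[0].get("finish_reason") if choices else None

# A chunk shaped like the truncated response above (id/created are placeholders)
line = ('data: {"id":"cmpl-0","object":"chat.completion.chunk","created":0,'
        '"model":"yi-large","choices":[{"delta":{"content":"。"},'
        '"index":0,"finish_reason":"length"}]}')

if finish_reason(line) == "length":
    print("output was cut off by the max_tokens limit")
```

`finish_reason` is `None` on ordinary content chunks and only set on the final chunk of a choice, so checking it at end of stream is enough to distinguish truncation from normal completion.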

@MartialBE
Owner

MartialBE commented Jun 10, 2024

Isn't 01's API the same format as OpenAI's? I don't do any processing on it, so the `max_tokens` parameter has to be passed by the caller, right?

@ken0311
Author

ken0311 commented Jun 11, 2024

It does have to be passed by the caller, but if the client doesn't pass it, long outputs get cut off, whereas OpenAI apparently doesn't do this. Would you consider automatically adding a `max_tokens` parameter for certain models when it is not provided? I'm not sure how hard that would be to implement.
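The suggestion above could be sketched as a small pre-forwarding step in the relay: if the client omitted `max_tokens`, fill in a per-model default before sending the request upstream. The model names and default values below are illustrative assumptions, not the project's actual configuration:

```python
# Hypothetical per-model defaults; the values are illustrative assumptions.
DEFAULT_MAX_TOKENS = {
    "yi-large": 4096,
    "yi-medium": 4096,
}

def apply_default_max_tokens(request: dict) -> dict:
    """If the client did not set max_tokens, fill in a per-model default
    so the upstream provider does not fall back to a small limit (~512)."""
    if request.get("max_tokens") is None:
        default = DEFAULT_MAX_TOKENS.get(request.get("model"))
        if default is not None:
            request["max_tokens"] = default
    return request

req = {"model": "yi-large",
       "messages": [{"role": "user", "content": "hi"}],
       "stream": True}
print(apply_default_max_tokens(req)["max_tokens"])  # 4096
```

A client-supplied value is left untouched, so this only changes behavior for requests that would otherwise hit the provider's small built-in default.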

@bahuzh

bahuzh commented Jun 15, 2024

This is a 01.AI issue: when the parameter is not passed, the default used to be around 512; I'm not sure what it is now.


3 participants