TTS curl Internal Server Error #1995

Closed
1 of 3 tasks
tsiens opened this issue Aug 1, 2024 · 6 comments · Fixed by #2009
@tsiens

tsiens commented Aug 1, 2024

System Info

cuda12.1

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

0.13.3

The command used to start Xinference

docker

Reproduction

Calling the model from Python succeeds, but the same request via curl returns an error:

curl -X 'POST' \
>   'http://127.0.0.1:9997/v1/audio/speech' \
>   -H 'accept: application/json' \
>   -H 'Content-Type: application/json' \
>   -d '{
>     "model": "ChatTTS",
>     "text": "你好",
>     "voice": "中文女"
>   }'
Internal Server Error
curl -X 'POST' \
>   'http://127.0.0.1:9997/v1/audio/speech' \
>   -H 'accept: application/json' \
>   -H 'Content-Type: application/json' \
>   -d '{
>     "model": "CosyVoice-300M-SFT",
>     "text": "你好",
>     "voice": "中文女"
>   }'
Internal Server Error

Expected behavior

A result should be returned, as with the chat completions endpoint:

curl -X 'POST' \
>   'http://127.0.0.1:9997/v1/chat/completions' \
>   -H 'accept: application/json' \
>   -H 'Content-Type: application/json' \
>   -d '{
>     "model": "qwen2-instruct",
>     "messages": [
>         {
>             "role": "system",
>             "content": "You are a helpful assistant."
>         },
>         {
>             "role": "user",
>             "content": "What is the largest animal?"
>         }
>     ],
>     "max_tokens": 512,
>     "temperature": 0.7
>   }'
{"id":"chatfad8c3cd-6e0b-49a8-b4c8-f082362712f9","object":"chat.completion","created":1722499076,"model":"qwen2-instruct","choices":[{"index":0,"message":{"role":"assistant","content":"The largest animal is the blue whale (Balaenoptera musculus). Adult blue whales can grow up to 100 feet (30 meters) in length and weigh as much as 200 tons (about 400,000 pounds or 181,437 kilograms). They live in all of the world's oceans, but their populations were severely depleted by commercial whaling activities that historically targeted them. Despite being listed as an endangered species globally since the early 1970s, some populations have shown signs of recovery due to conservation efforts and restrictions on whaling. Blue whales feed primarily on krill, which they catch by opening their mouths wide and engulfing large volumes of water rich with prey particles before filtering it out through comb-like baleen plates hanging from their upper jaws."},"finish_reason":"stop"}],"usage":{"prompt_tokens":25,"completion_tokens":172,"total_tokens":197}}
@XprobeBot XprobeBot added the gpu label Aug 1, 2024
@XprobeBot XprobeBot added this to the v0.14.0 milestone Aug 1, 2024
@qinxuye
Contributor

qinxuye commented Aug 1, 2024

Do you have the error stack on the server side?

@tsiens
Author

tsiens commented Aug 1, 2024

> Do you have the error stack on the server side?

No output at all

@qinxuye
Contributor

qinxuye commented Aug 3, 2024

@codingl2k1 Please take a look at this issue.

@codingl2k1
Contributor

> @codingl2k1 Please take a look at this issue.

OK

@codingl2k1
Contributor

Please try this curl:

curl -X 'POST' \
   'http://127.0.0.1:9997/v1/audio/speech' \
   -H 'accept: application/json' \
   -H 'Content-Type: application/json' \
   -d '{
     "model": "CosyVoice-300M-SFT",
     "input": "你好",
     "voice": "中文女"
   }'

If you want to make a request with a binary voice sample, it should be sent as a form request. Please refer to the Xinference client: https://github.com/xorbitsai/inference/blob/main/xinference/client/restful/restful_client.py#L764
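For reference, here is a minimal sketch of building the corrected request body in Python. The key point from this thread is that the speech endpoint expects a field named "input", not "text"; the server URL and model name below are taken from the commands in this issue, and actually sending the request assumes a running Xinference server.

```python
# Sketch: build the corrected JSON body for POST /v1/audio/speech.
# The failing curl commands used "text"; the endpoint expects "input".
import json

payload = {
    "model": "CosyVoice-300M-SFT",
    "input": "你好",   # was incorrectly named "text" in the failing requests
    "voice": "中文女",
}

body = json.dumps(payload, ensure_ascii=False)
print(body)

# To actually send it (assumes the server from this issue is running):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://127.0.0.1:9997/v1/audio/speech",
#       data=body.encode("utf-8"),
#       headers={"Content-Type": "application/json"},
#   )
#   audio = urllib.request.urlopen(req).read()  # raw audio bytes
```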

@leslie2046
Copy link
Contributor

Just remove the stream parameter.

5 participants