
Support string content_type in the output response #23

Open
lizzzcai opened this issue Jan 12, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

lizzzcai (Member) commented Jan 12, 2023

Follow-up of this issue to track support for the `str` content_type in the output response.

Expected example (note: the second input string is rephrased to avoid an apostrophe, which would break the single-quoted shell argument):

curl -i -X POST -H "Content-Type: application/json" "http://localhost:8008/v2/models/${MODEL_NAME}/infer" -d '{"inputs": [{ "name": "text", "shape": [2], "datatype": "BYTES", "data": ["I loved this food, it was very good", "I did not love this food, it was not good"] }] }'

{"model_name":"custom-predictor__ksp-89efd40320","model_version":"v0.1.0","outputs":[{"name":"predictions","datatype":"BYTES","shape":[2],"parameters":{"content_type":{"ParameterChoice":{"StringParam":"str"}}},"data":["compliment","complaint"]}]}

Currently the data is returned as ["Y29tcGxpbWVudA==","Y29tcGxhaW50"] and the `content_type` is base64.
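For context, with the current base64 behavior a client has to decode every BYTES element itself before it can use the predictions. A minimal Python sketch of that round trip (the response payload is hard-coded here for illustration; field names follow the example above):

```python
import base64
import json

# Illustrative copy of the current response body, where BYTES
# output data arrives base64-encoded.
response = json.loads(
    '{"outputs": [{"name": "predictions", "datatype": "BYTES",'
    ' "shape": [2], "data": ["Y29tcGxpbWVudA==", "Y29tcGxhaW50"]}]}'
)

# Today the client must base64-decode each element manually.
decoded = [
    base64.b64decode(item).decode("utf-8")
    for item in response["outputs"][0]["data"]
]
print(decoded)  # ['compliment', 'complaint']
```

With `content_type` reported as `str`, the `data` field would already contain the plain strings and this decoding step would be unnecessary.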

Motivations:

  1. Align with the V2 Inference Protocol.
  2. Make life easier for users.
@njhill njhill added the enhancement New feature or request label Jan 19, 2023