[DOCUMENTATION] Extend information about Hugging Face PyTorch DLCs for Inference #78

Open
alvarobartt opened this issue Aug 30, 2024 · 0 comments · May be fixed by #83

@alvarobartt
Member

Description

The Hugging Face PyTorch DLCs for Inference support a wide variety of tasks, and the expected I/O payloads differ depending on which HF_TASK value is set. Therefore, the README.md should be extended to at least mention the available environment variables and add some pointers to the expected I/O payloads for each task.
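
For instance, a purely illustrative sketch of how the payloads differ per task (the endpoint URL and route below are placeholders that depend on how the container is deployed; the payload shapes follow the standard Hugging Face pipelines conventions):

```python
# Illustrative sketch only: the endpoint URL/route is a placeholder and depends on
# the actual deployment; the payload shapes follow the standard Hugging Face
# pipelines conventions for each task.
import requests

ENDPOINT = "http://localhost:8080/predict"  # placeholder, adjust to the actual deployment

# With HF_TASK=text-classification the input is a plain string (or list of strings)
text_classification_payload = {"inputs": "I love this product!"}

# With HF_TASK=question-answering the input is a dict with `question` and `context`
question_answering_payload = {
    "inputs": {
        "question": "What is the capital of France?",
        "context": "Paris is the capital and largest city of France.",
    }
}

for payload in (text_classification_payload, question_answering_payload):
    response = requests.post(ENDPOINT, json=payload)
    print(response.json())
```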

Additionally, within hf.co/docs/google-cloud (once published) we could add a section on how to run inference for all the supported scenarios, as right now the documentation is limited to the existing examples, which may not cover every single use case.

@alvarobartt alvarobartt added the documentation Improvements or additions to documentation label Aug 30, 2024
@alvarobartt alvarobartt self-assigned this Aug 30, 2024