Add Text Generation Inference with JSON output #235
Conversation
Thank you for this PR and your work on it! I left a comment; it seems the docstrings have not yet been updated to match the current implementation. Also, it would be great if you could add a small section on how this works to the docs, see https://github.com/MaartenGr/KeyBERT/blob/master/docs/guides/llms.md
```python
    """
    def __init__(self,
                 url: str,
```
I propose passing the entire InferenceClient rather than just the URL, since not all of its parameters are exposed at the moment. Moreover, it would then follow the same structure as the OpenAI integration in this repo.
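The pattern being proposed can be sketched as follows. This is a minimal illustration of the design choice (class and attribute names here are hypothetical, not the code in the PR): by accepting a pre-built client object instead of a bare URL, every client parameter such as auth token or timeout stays configurable by the caller without the wrapper having to re-expose it.

```python
class TextGenerationInference:
    """Hypothetical wrapper that accepts a fully configured client object."""

    def __init__(self, client):
        # The caller constructs the client themselves, e.g.
        # InferenceClient(model=url, token=..., timeout=...),
        # so no client parameter needs to be mirrored here.
        self.client = client


class FakeClient:
    """Stand-in client used here only to demonstrate the pattern."""

    def __init__(self, model, timeout=None):
        self.model = model
        self.timeout = timeout


# The wrapper stays thin; all configuration lives on the client.
llm = TextGenerationInference(FakeClient(model="http://localhost:8080", timeout=120))
print(llm.client.timeout)
```

The alternative (passing only a URL) would force the wrapper to grow a new keyword argument every time a client option is needed.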
I updated the code. TextGenerationInference now accepts an InferenceClient when constructed. I also added json_schema in case someone wants a different output format.
…ionInference. Updated documentation
I also added the inference_kwargs to the
Awesome, everything looks good to me! Thank you for the work on this, it is highly appreciated 😄
Added the option to use TGI (Text Generation Inference) from HuggingFace with JSON-formatted output, which makes the output more predictable. It is also convenient for setups where a TGI server is already deployed for other use cases.
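The benefit of JSON-constrained output can be sketched as follows. This is an illustration of the general idea, not the schema or field names KeyBERT ships (the `keywords` field here is an assumption): when the server enforces a JSON schema as a grammar, the response is guaranteed to parse, so extracting keywords becomes a plain `json.loads` instead of brittle text scraping.

```python
import json

# Hypothetical schema a TGI grammar could enforce so the model always
# returns a parseable list of keywords (field names are assumptions).
keyword_schema = {
    "type": "object",
    "properties": {
        "keywords": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["keywords"],
}

# With a grammar-constrained server, the raw response is guaranteed to
# match the schema, so parsing cannot fail on free-form prose.
raw_response = '{"keywords": ["topic modeling", "keyword extraction", "BERT"]}'
keywords = json.loads(raw_response)["keywords"]
print(keywords)
```

Without the grammar, the model might wrap its answer in prose ("Sure! The keywords are: ..."), which is exactly the unpredictability this PR removes.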