feat: Support sending additional outputs from vLLM inference #70
Open
kthui wants to merge 12 commits into main from jacky-vllm-additional-outputs
Commits on Nov 1, 2024
- 10a5b94: Add additional outputs and their input switches to auto complete
  * [WIP] Add additional outputs to auto complete
  * [WIP] Use individual input tensor to control per additional output
  * [WIP] Parse additional output flags from request
  (a sketch of this switch/auto-complete pattern follows this list)
- 892f0d0
- 58ee481
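The Nov 1 commits describe the mechanism this PR introduces: each additional output gets an optional BOOL input tensor that acts as a per-request switch, registered through the Python backend's auto-complete config and parsed from the incoming request. Below is a minimal sketch of that pattern, assuming illustrative tensor names (`return_finish_reason`, `finish_reason`) and data types that are not confirmed from the PR diff.

```python
# Hypothetical sketch of the pattern described in commit 10a5b94: an optional
# BOOL input acts as a per-request switch for an extra output tensor.
# Names and dtypes here are assumptions, not necessarily those used in the PR.
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    @staticmethod
    def auto_complete_config(auto_complete_model_config):
        # Per-output switch: an optional BOOL input the client may send.
        auto_complete_model_config.add_input(
            {"name": "return_finish_reason", "data_type": "TYPE_BOOL",
             "dims": [1], "optional": True}
        )
        # The additional output itself.
        auto_complete_model_config.add_output(
            {"name": "finish_reason", "data_type": "TYPE_STRING", "dims": [-1]}
        )
        return auto_complete_model_config

    def _get_bool_flag(self, request, name):
        # Parse the switch from the request; a missing optional input means "off".
        tensor = pb_utils.get_input_tensor_by_name(request, name)
        if tensor is None:
            return False
        return bool(tensor.as_numpy()[0])
```

With this layout, a client that never sends the switch gets exactly the outputs it got before, which keeps the feature backward compatible.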
Commits on Nov 2, 2024
- 5e605ca: Add test for additional outputs
  * Add additional outputs test
  * Update copyright
  * Some test enhancement and notes
  (a client-side test sketch follows this list)
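The Nov 2 commit adds a test for the additional outputs. The sketch below shows how such a test could exercise one switch from the client side over gRPC streaming (the vLLM backend model is decoupled, so `stream_infer` is used rather than a plain `infer`). The model name `vllm_model` and all tensor names are assumptions for illustration.

```python
# Hypothetical client-side check in the spirit of the test added in 5e605ca:
# enable one additional output via its BOOL switch and verify the corresponding
# output tensor is present in the response.
import queue
from functools import partial

import numpy as np
import tritonclient.grpc as grpcclient


def callback(responses, result, error):
    # Collect either the result or the error for inspection after the stream ends.
    responses.put(error if error is not None else result)


responses = queue.Queue()
client = grpcclient.InferenceServerClient("localhost:8001")

text_in = grpcclient.InferInput("text_input", [1], "BYTES")
text_in.set_data_from_numpy(np.array(["Hello, my name is".encode()], dtype=object))
stream_flag = grpcclient.InferInput("stream", [1], "BOOL")
stream_flag.set_data_from_numpy(np.array([False]))
return_flag = grpcclient.InferInput("return_finish_reason", [1], "BOOL")
return_flag.set_data_from_numpy(np.array([True]))

client.start_stream(callback=partial(callback, responses))
client.async_stream_infer("vllm_model", inputs=[text_in, stream_flag, return_flag])

result = responses.get()
client.stop_stream()

# The additional output should be present when its switch is set to True.
assert result.as_numpy("finish_reason") is not None
```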
Commits on Nov 4, 2024
- f35e9c4
Commits on Nov 5, 2024
- e6e6404
Commits on Nov 6, 2024
- 44edd6e
- 1773dea
Commits on Nov 7, 2024
- 29099df
- 457eeaa
- 5e9b09f: Revert "Return token ids instead of number of token ids" (reverts commit 457eeaa)
- dae3c13