This repository has been archived by the owner on Oct 11, 2024. It is now read-only.
forked from vllm-project/vllm
Commit
update workflows to use generated whls (#204)
SUMMARY:
* update NIGHTLY workflow to be whl centric
* update benchmarking jobs to use generated whl

TEST PLAN: runs on remote push. I'm also triggering NIGHTLY manually.

---------

Co-authored-by: andy-neuma <[email protected]>
Co-authored-by: Domenic Barbuzzi <[email protected]>
1 parent 8f55a0c · commit 5c7a85d
Showing 14 changed files with 267 additions and 145 deletions.
@@ -0,0 +1,27 @@
name: install whl
description: 'installs found whl based on python version into specified venv'
inputs:
  python:
    description: 'python version, e.g. 3.10.12'
    required: true
  venv:
    description: 'name for python virtual environment'
    required: true
runs:
  using: composite
  steps:
    - id: install_whl
      run: |
        # move source directories
        mv vllm vllm-ignore
        mv csrc csrc-ignore
        # activate and install
        COMMIT=${{ github.sha }}
        VENV="${{ env.VENV_BASE }}-${COMMIT:0:7}"
        source $(pyenv root)/versions/${{ inputs.python }}/envs/${VENV}/bin/activate
        pip3 install -r requirements-dev.txt
        BASE=$(./.github/scripts/convert-version ${{ inputs.python }})
        WHL=$(find . -type f -iname "*${BASE}*.whl")
        WHL_BASENAME=$(basename ${WHL})
        pip3 install ${WHL}[sparse]
      shell: bash
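The `./.github/scripts/convert-version` helper is not part of this diff, so its exact behavior is an assumption: presumably it maps a full Python version such as `3.10.12` to the CPython wheel filename tag (`cp310`) so `find` can locate the matching whl. A minimal sketch of that mapping, as a hypothetical bash function:

```shell
# Hypothetical sketch of convert-version: derive the CPython wheel tag
# (e.g. "cp310") from a full Python version string (e.g. "3.10.12").
convert_version() {
  local version="$1"
  local major="${version%%.*}"   # "3.10.12" -> "3"
  local rest="${version#*.}"     # "3.10.12" -> "10.12"
  local minor="${rest%%.*}"      # "10.12"   -> "10"
  echo "cp${major}${minor}"
}
```

With this tag, the `find . -type f -iname "*${BASE}*.whl"` step in the action would match wheels like `vllm-0.2.0-cp310-cp310-linux_x86_64.whl`.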
5c7a85d
bigger_is_better
{"name": "request_throughput", "description": "VLLM Engine throughput - synthetic\nmodel - NousResearch/Llama-2-7b-chat-hf\nmax_model_len - 4096\nbenchmark_throughput {\n \"use-all-available-gpus_\": \"\",\n \"input-len\": 256,\n \"output-len\": 128,\n \"num-prompts\": 1000\n}", "gpu_description": "NVIDIA A10G x 1", "vllm_version": "0.2.0", "python_version": "3.10.12 (main, Mar 7 2024, 18:39:53) [GCC 9.4.0]", "torch_version": "2.2.1+cu121"}
4.028484379450804
prompts/s3.80234884054723
prompts/s0.94
{"name": "token_throughput", "description": "VLLM Engine throughput - synthetic\nmodel - NousResearch/Llama-2-7b-chat-hf\nmax_model_len - 4096\nbenchmark_throughput {\n \"use-all-available-gpus_\": \"\",\n \"input-len\": 256,\n \"output-len\": 128,\n \"num-prompts\": 1000\n}", "gpu_description": "NVIDIA A10G x 1", "vllm_version": "0.2.0", "python_version": "3.10.12 (main, Mar 7 2024, 18:39:53) [GCC 9.4.0]", "torch_version": "2.2.1+cu121"}
1546.9380017091084
tokens/s1460.1019547701362
tokens/s0.94
This comment was automatically generated by workflow using github-action-benchmark.
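The benchmark comment does not explain the ratio column; for a `bigger_is_better` metric it appears to be previous/current, so values below 1 mean this commit is faster. That reading can be checked against the reported numbers (an assumption about the tool's convention, not documented in the comment itself):

```shell
# Sketch: reproduce the 0.94 ratio from the reported throughputs,
# assuming ratio = previous / current for a bigger_is_better metric.
ratio() {
  awk -v prev="$1" -v curr="$2" 'BEGIN { printf "%.2f", prev / curr }'
}
```

For example, `ratio 1460.1019547701362 1546.9380017091084` yields `0.94`, matching the token_throughput row.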
5c7a85d
bigger_is_better
{"name": "request_throughput", "description": "VLLM Engine throughput - synthetic\nmodel - NousResearch/Llama-2-7b-chat-hf\nmax_model_len - 4096\nbenchmark_throughput {\n \"use-all-available-gpus_\": \"\",\n \"input-len\": 256,\n \"output-len\": 128,\n \"num-prompts\": 1000\n}", "gpu_description": "NVIDIA A10G x 1", "vllm_version": "0.2.0", "python_version": "3.10.12 (main, Mar 7 2024, 18:39:53) [GCC 9.4.0]", "torch_version": "2.2.1+cu121"}
4.030431163094849
prompts/s3.80234884054723
prompts/s0.94
{"name": "token_throughput", "description": "VLLM Engine throughput - synthetic\nmodel - NousResearch/Llama-2-7b-chat-hf\nmax_model_len - 4096\nbenchmark_throughput {\n \"use-all-available-gpus_\": \"\",\n \"input-len\": 256,\n \"output-len\": 128,\n \"num-prompts\": 1000\n}", "gpu_description": "NVIDIA A10G x 1", "vllm_version": "0.2.0", "python_version": "3.10.12 (main, Mar 7 2024, 18:39:53) [GCC 9.4.0]", "torch_version": "2.2.1+cu121"}
1547.6855666284218
tokens/s1460.1019547701362
tokens/s0.94
This comment was automatically generated by workflow using github-action-benchmark.