Actions: ROCm/vllm

Showing runs from all workflows
3,240 workflow runs

Add fp8 support for llama model family on Navi4x (#245)
ruff #877: Commit 4bba092 pushed by qli88
October 25, 2024 21:31 33s main
Add fp8 support for llama model family on Navi4x (#245)
yapf #877: Commit 4bba092 pushed by qli88
October 25, 2024 21:31 2m 49s main
Add fp8 support for llama model family on Navi4x (#245)
clang-format #731: Commit 4bba092 pushed by qli88
October 25, 2024 21:31 23s main
Add fp8 support for llama model family on Navi4x (#245)
mypy #731: Commit 4bba092 pushed by qli88
October 25, 2024 21:31 50s main
Add fp8 support for llama model family on Navi4x
ruff #876: Pull request #245 synchronize by qli88
October 25, 2024 21:20 30s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
yapf #876: Pull request #245 synchronize by qli88
October 25, 2024 21:20 2m 53s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
mypy #730: Pull request #245 synchronize by qli88
October 25, 2024 21:20 46s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
clang-format #730: Pull request #245 synchronize by qli88
October 25, 2024 21:20 23s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
clang-format #729: Pull request #245 synchronize by qli88
October 25, 2024 19:45 24s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
ruff #875: Pull request #245 synchronize by qli88
October 25, 2024 19:45 30s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
mypy #729: Pull request #245 synchronize by qli88
October 25, 2024 19:45 50s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
yapf #875: Pull request #245 synchronize by qli88
October 25, 2024 19:45 2m 56s qiang-navi4x-fp8-llama
Fix kernel cache miss and add RDNA configs
ruff #874: Pull request #246 opened by hyoon1
October 25, 2024 17:40 27s hyoon1:fix_max_seq
Fix kernel cache miss and add RDNA configs
yapf #874: Pull request #246 opened by hyoon1
October 25, 2024 17:40 2m 0s hyoon1:fix_max_seq
Fix kernel cache miss and add RDNA configs
mypy #728: Pull request #246 opened by hyoon1
October 25, 2024 17:40 47s hyoon1:fix_max_seq
Fix kernel cache miss and add RDNA configs
clang-format #728: Pull request #246 opened by hyoon1
October 25, 2024 17:40 23s hyoon1:fix_max_seq
Add fp8 support for llama model family on Navi4x
yapf #873: Pull request #245 opened by qli88
October 25, 2024 05:32 1m 57s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
clang-format #727: Pull request #245 opened by qli88
October 25, 2024 05:32 19s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
ruff #873: Pull request #245 opened by qli88
October 25, 2024 05:32 22s qiang-navi4x-fp8-llama
Add fp8 support for llama model family on Navi4x
mypy #727: Pull request #245 opened by qli88
October 25, 2024 05:32 48s qiang-navi4x-fp8-llama
[Bugfix][Kernel][Misc] Basic support for SmoothQuant, symmetric case …
clang-format #726: Commit c9fc160 pushed by rasmith
October 24, 2024 21:31 22s main