
Actions: nod-ai/shark-ai

Showing runs from all workflows
15,879 workflow run results

Update user docs for running llm server + upgrade gguf to `0.11.0…
CI - sharktank #1545: Commit f7d2681 pushed by stbaione
December 12, 2024 23:45 · 11m 49s · main
Update user docs for running llm server + upgrade gguf to `0.11.0…
pre-commit #3179: Commit f7d2681 pushed by stbaione
December 12, 2024 23:45 · 38s · main
Update user docs for running llm server + upgrade gguf to `0.11.0…
CI - sharktank perplexity short #480: Commit f7d2681 pushed by stbaione
December 12, 2024 23:45 · 4m 3s · main
Update user docs for running llm server + upgrade gguf to `0.11.0…
Llama Benchmarking 8B Tests #1163: Commit f7d2681 pushed by stbaione
December 12, 2024 23:45 · 6m 2s · main
Update user docs for running llm server + upgrade gguf to `0.11.0…
CI - shark-ai #744: Commit f7d2681 pushed by stbaione
December 12, 2024 23:45 · 9m 40s · main
Enable flash attention by default
pre-commit #3178: Pull request #690 synchronize by rsuderman
December 12, 2024 23:23 · 38s · rsuderman:flash_attention_enable
Enable flash attention by default
Llama Benchmarking 8B Tests #1162: Pull request #690 synchronize by rsuderman
December 12, 2024 23:23 · 6m 19s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - shark-ai #743: Pull request #690 synchronize by rsuderman
December 12, 2024 23:23 · 10m 18s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - sharktank #1544: Pull request #690 synchronize by rsuderman
December 12, 2024 23:23 · 11m 47s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - sharktank perplexity short #479: Pull request #690 synchronize by rsuderman
December 12, 2024 23:23 · 4m 6s · rsuderman:flash_attention_enable
Enable flash attention by default
pre-commit #3177: Pull request #690 opened by rsuderman
December 12, 2024 23:04 · 44s · rsuderman:flash_attention_enable
Enable flash attention by default
Llama Benchmarking 8B Tests #1161: Pull request #690 opened by rsuderman
December 12, 2024 23:04 · 5m 51s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - shark-ai #742: Pull request #690 opened by rsuderman
December 12, 2024 23:04 · 10m 33s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - sharktank #1543: Pull request #690 opened by rsuderman
December 12, 2024 23:04 · 11m 41s · rsuderman:flash_attention_enable
Enable flash attention by default
CI - sharktank perplexity short #478: Pull request #690 opened by rsuderman
December 12, 2024 23:04 · 3m 52s · rsuderman:flash_attention_enable
Update user docs for running llm server + upgrade gguf to 0.11.0
CI - sharktank perplexity short #477: Pull request #676 synchronize by stbaione
December 12, 2024 21:47 · 4m 2s · stbaione:llm-user-docs-update
Update user docs for running llm server + upgrade gguf to 0.11.0
CI - sharktank #1542: Pull request #676 synchronize by stbaione
December 12, 2024 21:47 · 11m 43s · stbaione:llm-user-docs-update
Update user docs for running llm server + upgrade gguf to 0.11.0
CI - shark-ai #741: Pull request #676 synchronize by stbaione
December 12, 2024 21:47 · 9m 56s · stbaione:llm-user-docs-update
Update user docs for running llm server + upgrade gguf to 0.11.0
Llama Benchmarking 8B Tests #1160: Pull request #676 synchronize by stbaione
December 12, 2024 21:47 · 8m 5s · stbaione:llm-user-docs-update
[tuner] Add direct TD spec generation for candidates
CI - sharktank #1541: Pull request #606 synchronize by Max191
December 12, 2024 21:31 · 11m 51s · Max191:tuner-candidate-spec-gen
[tuner] Add direct TD spec generation for candidates
CI - shark-ai #740: Pull request #606 synchronize by Max191
December 12, 2024 21:31 · 10m 14s · Max191:tuner-candidate-spec-gen
[tuner] Add direct TD spec generation for candidates
CI - sharktank perplexity short #476: Pull request #606 synchronize by Max191
December 12, 2024 21:31 · 4m 6s · Max191:tuner-candidate-spec-gen