Actions: Inpher/LocalAI

Build and Release

30 workflow runs

models(gallery): ⬆️ update checksum (#2278)
Build and Release #30: Commit f69de3b pushed by golgeek
master · May 9, 2024 17:43 · 32m 59s

feat(ux): Add chat, tts, and image-gen pages to the WebUI (#2222)
Build and Release #29: Commit 2c5a46b pushed by golgeek
master · May 2, 2024 20:23 · 21m 23s

models(gallery): Add Hermes-2-Pro-Llama-3-8B-GGUF (#2218)
Build and Release #28: Commit f7f8b48 pushed by golgeek
master · May 2, 2024 18:22 · 19m 55s

models(gallery): fixup phi-3 sha
Build and Release #27: Commit 962ebba pushed by golgeek
master · May 1, 2024 21:10 · 36m 3s

feat(models-ui): minor visual enhancements (#2109)
Build and Release #26: Commit d344daf pushed by golgeek
master · April 23, 2024 17:13 · 31m 42s

Merge branch 'master' into golgeek/chat_template
Build and Release #25: Commit 9d59368 pushed by golgeek
golgeek/chat_template · April 11, 2024 14:47 · 19m 31s

Merge upstream/master into branch
Build and Release #24: Commit bde6f6e pushed by golgeek
golgeek/chat_template · April 11, 2024 14:46 · 1m 32s

feat: add flash-attn in nvidia and rocm envs
Build and Release #23: Commit fb7291b pushed by golgeek
golgeek/flash-attn · April 11, 2024 00:07 · 17m 23s

feat: add flash-attn in nvidia and rocm envs
Build and Release #22: Commit 85b643f pushed by golgeek
golgeek/flash-attn · April 10, 2024 23:56 · 10m 48s

Use tokenizer.apply_chat_template() in vLLM
Build and Release #21: Commit 7be926e pushed by golgeek
golgeek/chat_template · April 10, 2024 22:05 · 30m 2s

Use tokenizer.apply_chat_template() in vLLM
Build and Release #20: Commit bea09bb pushed by golgeek
golgeek/chat_template · April 10, 2024 20:11 · 19m 45s

Use tokenizer.apply_chat_template() in vLLM
Build and Release #19: Commit fda8b15 pushed by golgeek
golgeek/chat_template · April 10, 2024 19:38 · 33m 25s

April 10, 2024 19:24 · 13m 39s
April 10, 2024 17:32 · 29m 52s

Use tokenizer.apply_chat_template() in vLLM
Build and Release #13: Commit 42e44f2 pushed by golgeek
golgeek/chat_template · April 10, 2024 15:09 · 2m 32s

Option to use tokenizer template
Build and Release #12: Commit 5cace86 pushed by golgeek
golgeek/chat_template · April 10, 2024 15:00 · 9m 18s

Option to use tokenizer template
Build and Release #11: Commit 052d6d9 pushed by golgeek
golgeek/chat_template · April 10, 2024 14:41 · 20m 18s

pass messages to backend
Build and Release #10: Commit 2be380f pushed by golgeek
golgeek/chat_template · April 9, 2024 23:28 · 17m 20s

ci: fixup latest image push
Build and Release #9: Commit cc3d601 pushed by golgeek
master · April 9, 2024 23:02 · 48m 42s

Bump vllm
Build and Release #8: Commit bcc6a0f pushed by golgeek
golgeek/chat_template · April 9, 2024 23:00 · 28m 0s

Use tokenizer.apply_chat_template() in vLLM
Build and Release #7: Commit e3e414e pushed by golgeek
golgeek/chat_template · April 9, 2024 22:58 · 2m 0s

Fix install exllama
Build and Release #6: Commit 7b18845 pushed by golgeek
golgeek/bump-vllm · March 1, 2024 17:24 · 22m 46s