Commit
Signed-off-by: thxCode <[email protected]>
Showing 1 changed file with 1 addition and 1 deletion.
Submodule llama.cpp updated 19 files:
+2 −0        .devops/nix/package.nix
+4 −2        .github/workflows/build.yml
+10 −1       CMakeLists.txt
+29 −6       Makefile
+27 −2       cmake/llama-config.cmake.in
+71 −46      convert_hf_to_gguf.py
+39 −0       docs/build.md
+2 −0        ggml/.gitignore
+0 −220      ggml/ggml_vk_generate_shaders.py
+32 −5       ggml/src/CMakeLists.txt
+14 −14      ggml/src/ggml-metal.m
+179 −550    ggml/src/ggml-metal.metal
+0 −144,956  ggml/src/ggml-vulkan-shaders.hpp
+5 −0        ggml/src/vulkan-shaders/CMakeLists.txt
+524 −0      ggml/src/vulkan-shaders/vulkan-shaders-gen.cpp
+19 −8       gguf-py/scripts/gguf_hash.py
+20 −16      src/llama.cpp
+2 −2        tests/test-tokenizer-0.cpp
+4 −3        tests/test-tokenizer-random.py