forked from abetlen/llama-cpp-python
Commit
This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
Showing 1 changed file with 1 addition and 1 deletion.
Submodule llama.cpp updated 17 files:
+10 −2 | CMakeLists.txt
+5 −2 | Makefile
+8 −3 | README.md
+54 −20 | build.zig
+18 −1 | examples/common.cpp
+1,356 −918 | examples/server/index.html.hpp
+156 −0 | examples/server/public/index.html
+0 −2 | flake.nix
+36 −6 | ggml-alloc.c
+4 −0 | ggml-alloc.h
+6 −3 | ggml-metal.h
+70 −125 | ggml-metal.m
+471 −498 | ggml-metal.metal
+160 −65 | llama.cpp
+3 −0 | scripts/get-wikitext-2.sh
+1 −0 | tests/CMakeLists.txt
+403 −0 | tests/test-llama-grammar.cpp