CS348Project / llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git, synced 2025-11-08 10:07:01 +00:00
Files at commit 99161230c44fae7d4bac07b899d9b1f1e20ef407
llama.cpp / common
Latest commit: 99161230c4 by Georgi Gerganov, llama : enable GPU inference by default with Metal (2023-09-03 10:30:53 +03:00)
File                 Last commit message                                              Last commit date
CMakeLists.txt       gguf : new file format with flexible meta data (beta) (#2398)   2023-08-21 23:07:43 +03:00
common.cpp           llama : enable GPU inference by default with Metal               2023-09-03 10:30:53 +03:00
common.h             llama : enable GPU inference by default with Metal               2023-09-03 10:30:53 +03:00
console.cpp          build : fix most gcc and clang warnings (#2861)                  2023-09-01 16:34:50 +03:00
console.h            gguf : new file format with flexible meta data (beta) (#2398)    2023-08-21 23:07:43 +03:00
grammar-parser.cpp   gguf : new file format with flexible meta data (beta) (#2398)    2023-08-21 23:07:43 +03:00
grammar-parser.h     gguf : new file format with flexible meta data (beta) (#2398)    2023-08-21 23:07:43 +03:00
log.h                logging: Fix creating empty file even when disabled (#2966)      2023-09-02 11:53:55 -06:00