CS348Project/llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git (synced 2025-10-27 08:21:30 +00:00)
Files in llama.cpp/.devops at commit 41386cf365d894134ee0813d15e2f5d76f6a4d8e
Latest commit: 1d49ca3759 by Yuannan, "nix : removed metal for nix (#16118)", 2025-10-06 12:29:56 +03:00
| File | Last commit | Date |
| --- | --- | --- |
| nix | nix : removed metal for nix (#16118) | 2025-10-06 12:29:56 +03:00 |
| cann.Dockerfile | docker : add cann build pipline (#14591) | 2025-08-01 10:02:34 +08:00 |
| cpu.Dockerfile | docker : Enable GGML_CPU_ALL_VARIANTS for ARM (#15267) | 2025-08-14 16:22:58 +02:00 |
| cuda.Dockerfile | Fix broken build: require updated pip to support --break-system-packages (#15357) | 2025-08-18 12:50:48 +02:00 |
| intel.Dockerfile | SYCL: Update to oneAPI 2025.2 (#16371) | 2025-10-02 10:16:25 +03:00 |
| llama-cli-cann.Dockerfile | docker : do not build tests (#13204) | 2025-04-30 10:44:07 +02:00 |
| llama-cpp-cuda.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| llama-cpp.srpm.spec | repo : update links to new url (#11886) | 2025-02-15 16:40:57 +02:00 |
| musa.Dockerfile | ci : fix musa docker build (#16306) | 2025-09-28 16:38:15 +02:00 |
| rocm.Dockerfile | CI: reenable cdna in rocm docker builds (#16376) | 2025-10-01 23:32:39 +02:00 |
| s390x.Dockerfile | devops: fix s390x docker release failure (#16231) | 2025-09-25 11:36:30 +08:00 |
| tools.sh | scripts : make the shell scripts cross-platform (#14341) | 2025-06-30 10:17:18 +02:00 |
| vulkan.Dockerfile | vulkan.Dockerfile: install vulkan SDK using tarball (#15282) | 2025-08-23 08:58:57 +02:00 |
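
The Dockerfiles above cover the project's different compute backends (CPU, CUDA, ROCm, MUSA, Intel/SYCL, Vulkan, CANN, s390x). As a minimal sketch, assuming each Dockerfile is meant to be built from the repository root with the standard docker CLI, a build looks like the following; the image tags are illustrative choices, not names taken from this listing:

    # Build the CPU-only image from the repository root
    # ("llama.cpp:cpu" is an arbitrary tag used for this sketch).
    docker build -f .devops/cpu.Dockerfile -t llama.cpp:cpu .

    # Same pattern for another backend, e.g. CUDA; running the resulting
    # image on a GPU additionally requires the NVIDIA container toolkit.
    docker build -f .devops/cuda.Dockerfile -t llama.cpp:cuda .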