Files
llama.cpp/tools/main/main.cpp (39 KiB)

Latest commit: 27ebfcacba by Diego Devesa, 2025-05-09 13:02:07 +02:00
llama : do not crash if there is no CPU backend (#13395)
* llama : do not crash if there is no CPU backend
* add checks to examples