Files
llama.cpp/examples/server/tests/unit/test_chat_completion.py
Latest commit: 45095a61bf by Xuan Son Nguyen, 2024-12-31 15:22:01 +01:00
server : clean up built-in template detection (#11026)

* server : clean up built-in template detection
* fix compilation
* add chat template test
* fix condition

File size: 9.2 KiB
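
The latest commit mentions adding a chat template test. As a rough illustration of what a test in this file might exercise, the sketch below sends a request to llama-server's OpenAI-compatible /v1/chat/completions endpoint and checks the response shape. It is not the file's actual contents: the base URL, model name, and messages are assumptions for the example, and the real suite drives the server through its own fixtures and utilities.

# Illustrative sketch only, not the actual test_chat_completion.py.
# Assumes a llama-server instance is already running at BASE_URL.
import requests

BASE_URL = "http://127.0.0.1:8080"  # assumed address of a locally running llama-server


def test_chat_completion_basic():
    """Send a simple chat request and check the shape of the response."""
    payload = {
        "model": "test",  # assumption: a single-model llama-server accepts any value here
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Say hello."},
        ],
        "max_tokens": 8,
    }
    res = requests.post(f"{BASE_URL}/v1/chat/completions", json=payload, timeout=60)
    assert res.status_code == 200
    body = res.json()
    choice = body["choices"][0]
    # The chat template applied server-side determines how the messages are
    # rendered into a prompt; the reply should come back as an assistant turn.
    assert choice["message"]["role"] == "assistant"
    assert isinstance(choice["message"]["content"], str)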