server : update docs (#13432)

Xuan-Son Nguyen
2025-05-10 18:44:49 +02:00
committed by GitHub
parent 43dfd741a5
commit 3b24d26c22
2 changed files with 40 additions and 16 deletions

@@ -6,7 +6,7 @@ llama.cpp supports multimodal input via `libmtmd`. Currently, there are 2 tools
 To enable it, you can use one of the 2 methods below (example commands follow the list):
-- Use `-hf` option with a [supported model](../../docs/multimodal.md)
+- Use `-hf` option with a supported model (see a list of pre-quantized models below)
   - To load a model using `-hf` while disabling multimodal, use `--no-mmproj`
   - To load a model using `-hf` while using a custom mmproj file, use `--mmproj local_file.gguf`
 - Use `-m model.gguf` option with `--mmproj file.gguf` to specify text and multimodal projector respectively
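
For illustration, a minimal sketch of both methods with `llama-server`; the Hugging Face repo name is only an example, substitute any supported pre-quantized model:

```sh
# Method 1: fetch a supported model (text model + mmproj) from Hugging Face via -hf
# (repo name is illustrative -- pick one from the pre-quantized list below)
llama-server -hf ggml-org/gemma-3-4b-it-GGUF

# Same model, but with multimodal support disabled
llama-server -hf ggml-org/gemma-3-4b-it-GGUF --no-mmproj

# Method 2: point at local files, text model and multimodal projector separately
llama-server -m model.gguf --mmproj local_file.gguf
```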