Fixed a few typos in the README of the LLaMA.cpp HTTP Server [no ci] (#16297)

This commit is contained in:
Imad Saddik
2025-09-28 12:04:46 +01:00
committed by GitHub
parent d8359f5fde
commit 2811c65286


@@ -391,7 +391,7 @@ node index.js
 ## API Endpoints
-### GET `/health`: Returns heath check result
+### GET `/health`: Returns health check result
 This endpoint is public (no API key check).
@@ -846,7 +846,7 @@ To use this endpoint with POST method, you need to start server with `--props`
 ### POST `/embeddings`: non-OpenAI-compatible embeddings API
-This endpoint supports all poolings, including `--pooling none`. When the pooling is `none`, the responses will contain the *unnormalized* embeddings for *all* input tokens. For all other pooling types, only the pooled embeddings are returned, normalized using Euclidian norm.
+This endpoint supports all poolings, including `--pooling none`. When the pooling is `none`, the responses will contain the *unnormalized* embeddings for *all* input tokens. For all other pooling types, only the pooled embeddings are returned, normalized using Euclidean norm.
 Note that the response format of this endpoint is different from `/v1/embeddings`.
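
To illustrate the endpoints touched by this change, here is a minimal sketch that checks `/health` and then queries the non-OpenAI-compatible `/embeddings` endpoint. It assumes a `llama-server` instance listening on `http://localhost:8080` and a JSON body with a `content` field as used elsewhere in the server README; the exact request/response fields should be verified against your server version.

```python
# Minimal sketch (assumptions: llama-server on localhost:8080, started with an
# embedding-capable model, e.g. `llama-server -m model.gguf --embeddings --pooling mean`).
import json
import urllib.request

BASE_URL = "http://localhost:8080"

def health_check() -> dict:
    # GET /health is public (no API key check) and reports server readiness.
    with urllib.request.urlopen(f"{BASE_URL}/health") as resp:
        return json.loads(resp.read())

def get_embeddings(text: str) -> list:
    # POST /embeddings with a "content" field (field name per the README's examples).
    payload = json.dumps({"content": text}).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

print(health_check())
result = get_embeddings("Hello, world!")
# With a pooled setting (e.g. --pooling mean) each input yields a single
# Euclidean-normalized vector; with --pooling none the response instead
# contains one unnormalized vector per input token.
print(json.dumps(result, indent=2)[:500])
```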