	Add Nix and Flox install instructions (#7899)
Author: Bryan Honof

 README.md | 24 ++++++++++++++++++++++++
@@ -387,6 +387,30 @@ brew install llama.cpp
```
The formula is automatically updated with new `llama.cpp` releases. More info: https://github.com/ggerganov/llama.cpp/discussions/7668
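
Since the formula tracks upstream releases, a quick way to see what was installed and to pick up new releases is shown below; this is a generic Homebrew usage sketch rather than anything specific to llama.cpp:
```
# show the installed formula version and its description
brew info llama.cpp

# pull in a newer release once the formula has been updated
brew upgrade llama.cpp
```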

### Nix

On Mac and Linux, llama.cpp can be installed via the Nix package manager.

For flake-enabled installs:
```
nix profile install nixpkgs#llama-cpp
```

For non-flake installs:
```
nix-env --file '<nixpkgs>' --install --attr llama-cpp
```

This expression is automatically updated within the [nixpkgs repo](https://github.com/NixOS/nixpkgs/blob/nixos-24.05/pkgs/by-name/ll/llama-cpp/package.nix#L164).
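
As a rough sanity check after either install method, something like the following can be used. The binary name is an assumption here (recent nixpkgs builds install `llama-cli`, but older revisions used different names) and the model path is a placeholder:
```
# confirm the profile entry exists (flake-style installs)
nix profile list | grep llama

# run a prompt against a local GGUF model (hypothetical path and binary name)
llama-cli -m ./models/llama-model.gguf -p "Hello"
```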

#### Flox

On Mac and Linux, Flox can be used to install llama.cpp within a Flox environment via
```
flox install llama-cpp
```
Flox follows the nixpkgs build of llama.cpp.
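
For context, a sketch of how this fits into a typical Flox workflow, assuming an environment is created in the current project directory; `flox init` and `flox activate` are standard Flox commands:
```
flox init               # create an environment in the current directory (first time only)
flox install llama-cpp  # add llama.cpp, built from nixpkgs
flox activate           # enter a subshell with llama.cpp on PATH
```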

### Metal Build

On MacOS, Metal is enabled by default. Using Metal makes the computation run on the GPU.