	Add link to Roadmap discussion
--- a/README.md
+++ b/README.md
@@ -7,8 +7,8 @@ Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++
 
 **Hot topics:**
 
+- [Roadmap (short-term)](https://github.com/ggerganov/llama.cpp/discussions/457)
 - New C-style API is now available: https://github.com/ggerganov/llama.cpp/pull/370
-- [Added Alpaca support](https://github.com/ggerganov/llama.cpp#instruction-mode-with-alpaca)
 - Cache input prompts for faster initialization: https://github.com/ggerganov/llama.cpp/issues/64
 - Create a `llama.cpp` logo: https://github.com/ggerganov/llama.cpp/issues/105
 