	Add missing struct annotation (#483)
`llama_sample_top_p_top_k` was missing the struct annotation on its `llama_context` parameter (line 126). Without the annotation, the header fails to compile when it is parsed by the Kotlin C interop generator. This commit fixes the issue by adding the struct annotation.
llama.h | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
@@ -123,7 +123,7 @@ extern "C" {
 
     // TODO: improve the last_n_tokens interface ?
     LLAMA_API llama_token llama_sample_top_p_top_k(
-              llama_context * ctx,
+       struct llama_context * ctx,
           const llama_token * last_n_tokens_data,
                         int   last_n_tokens_size,
                         int   top_k,
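
For context, a minimal sketch (illustrative only, not part of llama.cpp) of why the missing struct keyword in the declaration above breaks plain-C consumers such as the Kotlin C interop generator, assuming `llama_context` is only forward-declared as a plain struct with no accompanying typedef, as in llama.h:

/* sketch.c - illustrative only; assumes the header forward-declares the
 * context type as a plain struct with no typedef. */
struct llama_context;                       /* opaque forward declaration */

/* Valid in both C and C++: the struct keyword is spelled out. */
int use_ok(struct llama_context * ctx);

/* Valid in C++ only: without a typedef, "llama_context" is not a type
 * name in C, so a strict C parser (for example, a binding generator that
 * runs clang in C mode) rejects the declaration below. */
/* int use_bad(llama_context * ctx); */

int main(void) { return 0; }

A C++ compiler accepts both forms, which is why the omission went unnoticed until the header was fed to a C-only tool.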