mirror of https://github.com/ggml-org/llama.cpp.git
synced 2025-10-31 08:51:55 +00:00

e2bd725f4b
* fix oai proxy: fix generation not stopping while the bot stops talking in chat mode; fix possible missing `slot_id` in responses for CORS (and preflight) requests
* oai proxy: workaround for some clients (such as Chatbox)
* use the stop sequence as separator, replacing the hardcoded `\n`
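The last point can be sketched as follows. This is a hypothetical illustration, not the proxy's actual code: the function name `build_prompt`, the message format, and the `</s>` default are assumptions. The idea is that when a chat proxy flattens messages into one prompt, joining turns with the model's configured stop sequence (rather than a hardcoded `\n`) lets generation halt cleanly at the end of the bot's turn.

```python
# Hypothetical sketch: join chat turns with the model's stop sequence
# instead of a hardcoded "\n", so sampling stops at the turn boundary.
def build_prompt(messages, stop="</s>"):
    # `messages` follows the OpenAI chat shape: a list of dicts with
    # "role" and "content" keys. `stop` is the model's stop sequence.
    parts = [f"{m['role']}: {m['content']}" for m in messages]
    # Using the stop sequence as the separator means the server's stop
    # criterion fires exactly where one turn ends and the next begins.
    return stop.join(parts) + stop

prompt = build_prompt(
    [{"role": "user", "content": "hi"},
     {"role": "assistant", "content": "hello"}]
)
# prompt == "user: hi</s>assistant: hello</s>"
```

With a hardcoded `\n` separator, a model whose stop token differs from `\n` could keep generating past the end of its reply; keying the separator to the configured stop sequence avoids that mismatch.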
10 KiB
Executable File