Georgi Gerganov · d3ae0ee8d7
py : fix requirements check '==' -> '~=' (#8982)

* py : fix requirements check '==' -> '~='
* cont : fix the fix
* ci : run on all requirements.txt

2024-08-12 11:02:01 +03:00
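For context on the two operators (PEP 440): `==` matches only the exact version, while `~=` (the compatible-release operator) also accepts newer patch releases of the same series. A minimal sketch using the `packaging` library; the version numbers below are illustrative only and not taken from the repository's requirements files:

```python
# Illustrative only: the version numbers are made up, not from llama.cpp's requirements.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

installed = Version("2.32.3")

# An exact pin rejects a newer patch release of the same package.
print(SpecifierSet("==2.32.0").contains(installed))  # False
# A compatible-release pin accepts any 2.32.x at or above 2.32.0.
print(SpecifierSet("~=2.32.0").contains(installed))  # True
```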
tc-mb · 3071c0a5f2
llava : support MiniCPM-V-2.5 (#7599)

* init
* rename
* add run android for termux in readme
* add android readme
* add instructions in readme
* change name in readme
* Update README.md
* fixed line
* add result in readme
* random pos_embed
* add positions index
* change for ollama
* change for ollama
* better pos_embed in clip
* support ollama
* update cmakelist
* update cmakelist
* rename wrapper
* clear code
* replace and organize code
* add link
* sync master
* fix warnings
* fix warnings
* fix bug in bicubic resize when need resize image smaller
* receive review comments and modify
* receive review comments and modify
* put all code into llava dir
* fix quality problem in pr code
* change n_layer
* add space in "-1"
* imitate reshape bug of python code
* fix bug in clip
* fix issues for merging
* fix llama-minicpmv-cli in cmake file
* change pr readme
* fix code review
* remove in line 33 directory in the /cmakelists.txt (not in example, in the main dir)
* fix cmakefile
* add warn
* fix KEY_HAS_MINICPMV_PROJ
* remove load_image_size into clip_ctx
* remove the extern "C", MINICPMV_API
* fix uhd code for review comment
* delete minicpmv-wrapper in pr
* remove uhd_image_embed
* Modify 2 notes
* clip : style changes
* del common.h in clip
* fix Type-Check error
* fix Type-Check error
* fix Type-Check error
* fix Type-Check error
* fix makefile error
* fix ubuntu-make error
* try fix clip
* try fix 1

---------

Co-authored-by: Hongji Zhu <fireyoucan@gmail.com>
Co-authored-by: harvestingmoon <leewenyeong@gmail.com>
Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

2024-08-09 13:33:53 +03:00
compilade · d39130a398
py : use cpu-only torch in requirements.txt (#8335)

2024-07-07 14:23:38 +03:00
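The change itself lives in the requirements files (pulling the CPU-only torch wheel keeps installs much smaller and avoids dragging in CUDA libraries for conversion scripts that run on the CPU anyway). As a rough, hypothetical way to confirm which build actually got installed, assuming `torch` is importable in the current environment:

```python
# Hypothetical check, not part of the repository's scripts.
import torch

print(torch.__version__)          # CPU-only wheels usually carry a "+cpu" suffix
print(torch.version.cuda)         # None when the wheel was built without CUDA
print(torch.cuda.is_available())  # False on a CPU-only build (and on machines without a GPU)
```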
Georgi Gerganov · e235b267a2
py : switch to snake_case (#8305)

* py : switch to snake_case
ggml-ci
* cont
ggml-ci
* cont
ggml-ci
* cont : fix link
* gguf-py : use snake_case in scripts entrypoint export
* py : rename requirements for convert_legacy_llama.py
Needed for scripts/check-requirements.sh

---------

Co-authored-by: Francis Couture-Harpin <git@compilade.net>

2024-07-05 07:53:33 +03:00
ditsuke · 07786a61a2
chore: Fixup requirements and build

2024-07-04 15:39:13 +00:00
Galunid · 9c4c9cc83f
Move convert.py to examples/convert-legacy-llama.py (#7430)

* Move convert.py to examples/convert-no-torch.py
* Fix CI, scripts, readme files
* convert-no-torch -> convert-legacy-llama
* Move vocab thing to vocab.py
* Fix convert-no-torch -> convert-legacy-llama
* Fix lost convert.py in ci/run.sh
* Fix imports
* Fix gguf not imported correctly
* Fix flake8 complaints
* Fix check-requirements.sh
* Get rid of ADDED_TOKENS_FILE, FAST_TOKENIZER_FILE
* Review fixes

2024-05-30 21:40:00 +10:00
Daniel Bevenius · e00d2a62dd
llava : add requirements.txt and update README.md (#5428)

* llava: add requirements.txt and update README.md

This commit adds a `requirements.txt` file to the `examples/llava`
directory. The file lists the Python packages required to run the
scripts in `examples/llava`.

The motivation for this is to make it easier for users to run those
scripts and to keep them from running into missing-package errors when
the dependencies are not already installed on their system.

Signed-off-by: Daniel Bevenius <daniel.bevenius@gmail.com>

* llava: fix typo in llava-surgery.py output

Signed-off-by: Daniel Bevenius <daniel.bevenius@gmail.com>

---------

Signed-off-by: Daniel Bevenius <daniel.bevenius@gmail.com>

2024-02-09 15:00:59 +02:00
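Since the point of the commit is letting users install the example's dependencies up front, the file is consumed with pip's `-r` flag. A small, hypothetical helper that does this through the current interpreter, assuming it is run from the repository root:

```python
# Hypothetical convenience script, not part of the repository.
import subprocess
import sys

# Install the llava example dependencies into the active Python environment.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "-r", "examples/llava/requirements.txt"],
    check=True,  # raise CalledProcessError if pip fails
)
```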