Replies: 4 comments
-
It's really a good idea.
-
I agree! LLaVA is even better: better training data and performance, and it already has several good quantised models available, e.g. https://huggingface.co/Hyunel/llava-13b-v1-1-4bit-128g/. Would anyone be able to describe what would be needed to add image support to llama.cpp? I can only do Python myself, unfortunately, but I'm really curious how hard it would be.
-
Would acceleration also be possible for Tesseract (https://github.com/tesseract-ocr/tesseract)?
-
Duplicate (mostly) of #1050
-
The https://github.com/Vision-CAIR/MiniGPT-4 project used Vicuna (a LLaMA-based model) for training: the pretrained weights were frozen and, as I understand it, only a single projection layer was trained.
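The frozen-backbone idea can be sketched in plain Python (a toy illustration under my own assumptions, not MiniGPT-4's actual code): the pretrained side is held fixed, and only a small linear projection from image-feature space into the embedding space gets gradient updates.

```python
import random

random.seed(0)

# Frozen "pretrained" mapping: stands in for the fixed LLM/vision weights.
# In this toy setup it doubles as the ground truth the projection must match.
W_frozen = [[0.5, -0.2], [0.1, 0.8]]  # never updated

def project(W, x):
    # 2x2 matrix-vector product: maps an "image feature" x into embedding space.
    return [sum(W[i][j] * x[j] for j in range(2)) for i in range(2)]

# Trainable projection layer, initialised at zero; the ONLY thing we update.
W = [[0.0, 0.0], [0.0, 0.0]]
lr = 0.1

data = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(50)]

def mse():
    total = 0.0
    for x in data:
        y, t = project(W, x), project(W_frozen, x)
        total += sum((yi - ti) ** 2 for yi, ti in zip(y, t))
    return total / len(data)

loss_before = mse()
for _ in range(200):
    for x in data:
        y, t = project(W, x), project(W_frozen, x)
        for i in range(2):
            for j in range(2):
                # SGD step on the projection alone; W_frozen stays untouched.
                W[i][j] -= lr * 2 * (y[i] - t[i]) * x[j]
loss_after = mse()
print(loss_before, loss_after)
```

In a real framework this is just `param.requires_grad = False` on the backbone and passing only the projection's parameters to the optimiser, which is why this style of training is so cheap compared to full fine-tuning.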