Is there a llama.cpp inference repository?
#1 by percisestretch - opened
I want to run inference with these GGUF files, but it seems I need the llama.cpp inference code for that.
Not yet; at this stage, those GGUF files are probably intended for developers working on the inference support.
I think for llama.cpp usage we need an mmproj file for this model?
The vision tensors are inside the file; you could wait for llama.cpp support, or use llava-v1.5-13b with its mmproj, etc. instead for the time being.
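In case it helps while waiting, here is a minimal sketch of that interim llava-v1.5-13b route using llama-cpp-python's Llava15ChatHandler. The GGUF/mmproj file names and the image URL below are placeholders, not files from this repo, so substitute whatever you have downloaded locally:

```python
# Minimal sketch: run a LLaVA-style GGUF + mmproj pair through llama-cpp-python
# until this model is supported in llama.cpp directly.
# The model/mmproj paths and the image URL are placeholders.
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# The mmproj file holds the vision encoder/projector weights.
chat_handler = Llava15ChatHandler(clip_model_path="./mmproj-llava-v1.5-13b-f16.gguf")

llm = Llama(
    model_path="./llava-v1.5-13b.Q4_K_M.gguf",
    chat_handler=chat_handler,
    n_ctx=4096,  # leave room for the image embedding tokens
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are an assistant that describes images."},
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.jpg"}},
            ],
        },
    ]
)
print(response["choices"][0]["message"]["content"])
```

The key point is that the text-model GGUF and the mmproj projector file are loaded together; the mmproj supplies the vision tensors that the runtime needs for image input.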
Hi all, it works now; try the new ggc f6 and select the GGUF (from this repo) directly in your current directory to interact with it.
Could you give me more details? Thank you very much!
can we run it in llama.cpp?