GGUF for CausalLM/35b-beta-long?
#1 by Elfrino - opened
Hello guys,
I was just wondering if it's possible to get some GGUFs for this model?:
https://huggingface.co/CausalLM/35b-beta-long
It appears to be one of the few fine-tuned Command-R 35b models, showing promising test results.
Thank you in advance.
Sorry about the late reply. As of now, Command R conversion doesn't seem to work after the llama.cpp BPE update. Will do this once a fix is released.
No worries.
I think they merged a fix yesterday. :)
They will be up here in a few mins - QuantFactory/CausalLM-35b-beta-long-GGUF
Awesome, thank you! :)
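For anyone who wants to grab these programmatically once they appear, here's a minimal sketch using huggingface_hub and llama-cpp-python. The GGUF filename shown (a Q4_K_M quant) is an assumption; check the actual file list in the QuantFactory repo before running.

```python
# Sketch: download a GGUF from the QuantFactory repo and load it locally.
# The filename below is assumed -- verify it against the repo's "Files" tab.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

repo_id = "QuantFactory/CausalLM-35b-beta-long-GGUF"
filename = "CausalLM-35b-beta-long.Q4_K_M.gguf"  # assumed name, check the Hub

# Downloads the file into the local HF cache and returns its path.
model_path = hf_hub_download(repo_id=repo_id, filename=filename)

# n_gpu_layers=-1 offloads all layers if llama.cpp was built with GPU support.
llm = Llama(model_path=model_path, n_ctx=8192, n_gpu_layers=-1)

out = llm("Briefly introduce yourself.", max_tokens=128)
print(out["choices"][0]["text"])
```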