# Phind-CodeLlama-34B-v2 EXL2

Weights of [Phind-CodeLlama-34B-v2](https://huggingface.co/Phind/Phind-CodeLlama-34B-v2) converted to [EXL2](https://github.com/turboderp/exllamav2#exl2-quantization) format.

Each quant lives in its own branch, as in TheBloke's GPTQ repos. For example, to fetch only the 5.0-bpw quant:
|
```
export BRANCH=5_0-bpw-h8
git clone --single-branch --branch ${BRANCH} https://huggingface.co/latimar/Phind-Codellama-34B-v2-exl2
```
|

The following branches are available:

|
```
5_0-bpw-h8
4_625-bpw-h6
4_125-bpw-h6
2_75-bpw-h6
2_55-bpw-h6
```
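
To pick a branch for your GPU, a rough weight-only footprint is parameters × bpw / 8 bytes. The sketch below is a back-of-envelope estimate, not a measurement: the ~34B parameter count is taken from the model name, and KV cache plus activations will need additional VRAM on top of this.

```python
# Rough weight-only VRAM estimate for each quant branch of a ~34B model.
# Assumption: ~34e9 parameters; KV cache and activations are NOT included.

def weights_gb(params: float, bpw: float) -> float:
    """Convert bits-per-weight to gigabytes of weight storage."""
    return params * bpw / 8 / 1e9

PARAMS = 34e9
BRANCHES = [
    ("5_0-bpw-h8", 5.0),
    ("4_625-bpw-h6", 4.625),
    ("4_125-bpw-h6", 4.125),
    ("2_75-bpw-h6", 2.75),
    ("2_55-bpw-h6", 2.55),
]

for branch, bpw in BRANCHES:
    print(f"{branch}: ~{weights_gb(PARAMS, bpw):.1f} GB of weights")
```

For instance, the 5.0-bpw branch works out to roughly 21 GB of weights alone, so it targets 24 GB cards, while the 2.55-bpw branch (~11 GB) can fit on 16 GB cards with room for context.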