
# CantoneseLLM-6B-preview202402 with ExLlamaV2 Quantization

This is a quantized model of /hon9kon9ize/CantoneseLLM-6B-preview202402 in exl2 format.

You are currently on the main branch, which provides only the measurement.json file used during ExLlamaV2 quantization. Please choose a quantization level from the table of branches below.

| Branch | Bits per weight |
|---|---|
| 8.0bpw-h8 | 8.0 |
| 6.0bpw-h6 | 6.0 |
| 5.0bpw-h6 | 5.0 |
| 4.0bpw-h6 | 4.0 |
| 3.0bpw-h6 | 3.0 |
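As a minimal sketch, one branch can be fetched on its own with `git clone --single-branch`; the repository URL below is an assumption based on this page's author, so substitute the actual repo URL:

```shell
# Clone only the 6.0bpw-h6 quantization branch (pick any branch
# name from the table above instead).
# NOTE: the repository URL is an assumed placeholder, not confirmed
# by this README - replace it with the real repo URL.
git clone --single-branch --branch 6.0bpw-h6 \
    https://huggingface.co/kennylam/CantoneseLLM-6B-preview202402-exl2
```

Cloning a single branch avoids downloading every quantization level, which matters here since each branch carries a full copy of the model weights.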