bartowski committed on
Commit 0338a68
1 Parent(s): cbfe5aa

Add estimated VRAM table

Files changed (1):
  1. README.md +7 -15
README.md CHANGED
@@ -15,23 +15,15 @@ Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.13">turb
 
 Each branch contains an individual bits per weight, with the main one containing only the measurement.json for further conversions.
 
-Conversion was done using the default calibration dataset.
-
-Default arguments used except when the bits per weight is above 6.0, at that point the lm_head layer is quantized at 8 bits per weight instead of the default 6.
-
 Original model: https://huggingface.co/abacusai/Smaug-Mixtral-v0.1
 
-<a href="https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/6_5">6.5 bits per weight</a>
-
-<a href="https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/4_25">4.25 bits per weight</a>
-
-<a href="https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_75">3.75 bits per weight</a>
-
-<a href="https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_5">3.5 bits per weight</a>
-
-<a href="https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_0">3.0 bits per weight</a>
-
+| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
+| ------ | ---- | ------------ | --------- | ---------- | ---------- | ----------- |
+| [6_5](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/6_5) | 6.5 | 8.0 | 38.9 GB | 40.4 GB | 42.4 GB | Near-unquantized performance at vastly reduced size, **recommended (if you can run it)**. |
+| [4_25](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/4_25) | 4.25 | 6.0 | 25.9 GB | 27.4 GB | 29.4 GB | GPTQ-equivalent bits per weight, slightly higher quality. |
+| [3_75](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_75) | 3.75 | 6.0 | 23.0 GB | 24.5 GB | 26.5 GB | Lower quality; only use if you have to. |
+| [3_5](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_5) | 3.5 | 6.0 | 21.5 GB | 23.0 GB | 25.0 GB | Lower quality; only use if you have to. |
+| [3_0](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_0) | 3.0 | 6.0 | 18.9 GB | 20.4 GB | 22.4 GB | Very low quality, usable with 16 GB of VRAM. |
 
 ## Download instructions
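Since each row of the new table maps one-to-one to a git branch of the repo, picking a quant can be scripted. A minimal sketch: `snapshot_download` and its `revision` parameter are part of `huggingface_hub`, but the branch list and the local-directory naming here are taken from the table above, and the helper itself is hypothetical.

```python
# Sketch: build snapshot_download() arguments for one quant branch of the
# exl2 repo. Repo id and branch names come from the VRAM table in this
# commit; the local_dir naming scheme is an arbitrary choice.

REPO_ID = "bartowski/Smaug-Mixtral-v0.1-exl2"
BRANCHES = ["6_5", "4_25", "3_75", "3_5", "3_0"]  # from the VRAM table

def download_args(branch: str) -> dict:
    """Arguments for huggingface_hub.snapshot_download() for one branch."""
    if branch not in BRANCHES:
        raise ValueError(f"unknown quant branch: {branch}")
    return {
        "repo_id": REPO_ID,
        "revision": branch,  # the git branch holds that bits-per-weight quant
        "local_dir": f"Smaug-Mixtral-v0.1-exl2-{branch}",
    }

# Usage (needs `pip install huggingface_hub` and tens of GB of disk/bandwidth):
#   from huggingface_hub import snapshot_download
#   snapshot_download(**download_args("4_25"))
```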