---
license: apache-2.0
tags:
- mixtral
- finetune
quantized_by: bartowski
pipeline_tag: text-generation
---
## Exllama v2 Quantizations of Smaug-Mixtral-v0.1
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.13">turboderp's ExLlamaV2 v0.0.13</a> for quantization.
<b>The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)</b>
Each branch contains an individual bits per weight, with the main one containing only the meaurement.json for further conversions.
Original model: https://huggingface.co/abacusai/Smaug-Mixtral-v0.1
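If you want a bits per weight that isn't listed below, the measurement.json from the main branch can in principle be reused with ExLlamaV2's convert.py so the measurement pass doesn't have to be rerun. A rough sketch follows; the paths are placeholders and the flag names reflect the exllamav2 repo at the time of writing, so check `python convert.py -h` for your version:

```shell
# Sketch only: requantize the original model at a different bits per weight,
# reusing the existing measurement.json instead of re-measuring.
python convert.py \
  -i /path/to/Smaug-Mixtral-v0.1 \
  -o /path/to/working_dir \
  -cf /path/to/Smaug-Mixtral-v0.1-5.0bpw \
  -m /path/to/measurement.json \
  -b 5.0
```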
| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | VRAM (32k) | Description |
| ------ | ---- | ------------ | ---- | ---- | ---- | ----------- |
| [6_5](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/6_5) | 6.5 | 8.0 | 38.9 GB | 40.4 GB | 42.4 GB | Near unquantized performance at vastly reduced size, **recommended (if you can run it..)**. |
| [4_25](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/4_25) | 4.25 | 6.0 | 25.9 GB | 27.4 GB | 29.4 GB | GPTQ equivalent bits per weight, slightly higher quality. |
| [3_75](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_75) | 3.75 | 6.0 | 23.0 GB | 24.5 GB | 26.5 GB | Lower quality, but pretty usable. Good for 4k context on 24GB. |
| [3_5](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_5) | 3.5 | 6.0 | 21.5 GB | 23.0 GB | 25.0 GB | Lower quality, only use if you need more context on 24GB. |
| [3_0](https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2/tree/3_0) | 3.0 | 6.0 | 18.9 GB | 20.4 GB | 22.4 GB | Very low quality, pushes context to max but likely unusable. |
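The VRAM columns are approximate totals at the given context length. Before picking a branch, you can check how much memory your GPU actually has free, for example on NVIDIA hardware:

```shell
# Show total and currently free VRAM per GPU
nvidia-smi --query-gpu=name,memory.total,memory.free --format=csv
```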
## Download instructions
With git:
```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Smaug-Mixtral-v0.1-exl2
```
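Note that the weights are stored with Git LFS, so if it isn't set up yet, install and initialize it before cloning (a one-time step, assuming git-lfs is available from your package manager):

```shell
# One-time setup so git clone pulls the large weight files instead of LFS pointers
git lfs install
```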
With the `huggingface-hub` CLI (credit to TheBloke for the instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you just need the measurement.json) to a folder called `Smaug-Mixtral-v0.1-exl2`:
```shell
mkdir Smaug-Mixtral-v0.1-exl2
huggingface-cli download bartowski/Smaug-Mixtral-v0.1-exl2 --local-dir Smaug-Mixtral-v0.1-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
Linux:
```shell
mkdir Smaug-Mixtral-v0.1-exl2-6_5
huggingface-cli download bartowski/Smaug-Mixtral-v0.1-exl2 --revision 6_5 --local-dir Smaug-Mixtral-v0.1-exl2-6_5 --local-dir-use-symlinks False
```
Windows (which sometimes has trouble with `_` in folder names):
```shell
mkdir Smaug-Mixtral-v0.1-exl2-6.5
huggingface-cli download bartowski/Smaug-Mixtral-v0.1-exl2 --revision 6_5 --local-dir Smaug-Mixtral-v0.1-exl2-6.5 --local-dir-use-symlinks False
```
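If downloads are slow, huggingface-hub's optional fast-transfer backend can usually speed them up; a minimal example, assuming you want the 6_5 branch:

```shell
pip3 install hf_transfer
# Enable the fast transfer backend just for this download
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download bartowski/Smaug-Mixtral-v0.1-exl2 --revision 6_5 --local-dir Smaug-Mixtral-v0.1-exl2-6_5 --local-dir-use-symlinks False
```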