Llamacpp quants
- .gitattributes +12 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q2_K.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q3_K_L.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q3_K_M.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q3_K_S.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q4_0.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q4_K_S.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q5_0.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q5_K_M.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q5_K_S.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q6_K.gguf +3 -0
- LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf +3 -0
- README.md +32 -0
.gitattributes
CHANGED
@@ -33,3 +33,15 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
LiberatedHermes-2-Pro-Mistral-7B-Q2_K.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ef48a716c34c26aef6461379362bb06ee04733f8bb1ea961f6b89621d36fe809
+size 2719393856
LiberatedHermes-2-Pro-Mistral-7B-Q3_K_L.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:379aa4f4f7141dff097e9ba55b7f6802448eddd22bed1587d91a5169636f445b
+size 3822189632
LiberatedHermes-2-Pro-Mistral-7B-Q3_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fc4955f0a3654251068da6efb56a7bd27d2fd36cf36482afc072658240cb8c6c
+size 3519151168
LiberatedHermes-2-Pro-Mistral-7B-Q3_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c7cca0e86962e5928d29e6f6be68bd0001a392dd80617b0993ead8bc6564cef3
+size 3164732480
LiberatedHermes-2-Pro-Mistral-7B-Q4_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:15577ba11c955e3c4cd83c1b490e138b5404e89dd17ad431c1f532ba87c1ad09
+size 4109099072
LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:930a414e3b895dcdec09a109fed2340ec70707978d29cbea275f26243f2fe50c
+size 4368621632
LiberatedHermes-2-Pro-Mistral-7B-Q4_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6a33c66cab6794d481f91f9324da87acac4e9e01be9be44f6f5efb915e0637ba
+size 4140556352
LiberatedHermes-2-Pro-Mistral-7B-Q5_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6cd8bdf24c6f3364facaeca26c9d82fa53d709d3b36e9c68305f8b86f471f667
+size 4997914688
LiberatedHermes-2-Pro-Mistral-7B-Q5_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aa698e507754325a0d94f8712807d67a8cf17243ee752ab3e1f0bca873c26563
+size 5131608128
LiberatedHermes-2-Pro-Mistral-7B-Q5_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a79c17104c518b3209137aabee199466a60b2359bc8b9f586ab40ec413033a89
+size 4997914688
LiberatedHermes-2-Pro-Mistral-7B-Q6_K.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b892a668dde911cee73c029578ebbae63d92734cd4820f05c9fa3547ba5491fb
+size 5942281280
LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7a7b52ff8b27ca20733f5d0092e5eeb4aad87b0c9811839b2d655aef0d163fbb
+size 7696137280
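Each three-line file above is a Git LFS pointer: `oid` holds the SHA-256 of the real file and `size` its byte count. As a minimal sketch (the helper name is hypothetical, not part of this repo), a downloaded quant can be checked against the values from its pointer:

```python
import hashlib

def verify_lfs_pointer(file_path: str, expected_oid: str, expected_size: int) -> bool:
    """Return True if the file matches the oid/size from its LFS pointer."""
    digest = hashlib.sha256()
    size = 0
    with open(file_path, "rb") as f:
        # Hash in 1 MiB chunks so multi-GB GGUF files don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
            size += len(chunk)
    return digest.hexdigest() == expected_oid and size == expected_size
```

Feeding it the path to `LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf` with the oid and size shown above would confirm the download is intact.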
README.md
ADDED
@@ -0,0 +1,32 @@
+---
+license: apache-2.0
+base_model: NousResearch/Hermes-2-Pro-Mistral-7B
+datasets: abacusai/SystemChat
+quantized_by: bartowski
+pipeline_tag: text-generation
+---
+
+## Llamacpp Quantizations of LiberatedHermes-2-Pro-Mistral-7B
+
+Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b2440">b2440</a> for quantization.
+
+Original model: https://huggingface.co/macadeliccc/LiberatedHermes-2-Pro-Mistral-7B
+
+Download a file (not the whole branch) from below:
+
+| Filename | Quant type | File Size | Description |
+| -------- | ---------- | --------- | ----------- |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q8_0.gguf) | Q8_0 | 7.69GB | Extremely high quality, generally unneeded but max available quant. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q6_K.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q6_K.gguf) | Q6_K | 5.94GB | Very high quality, near perfect, *recommended*. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q5_K_M.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q5_K_M.gguf) | Q5_K_M | 5.13GB | High quality, very usable. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q5_K_S.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q5_K_S.gguf) | Q5_K_S | 4.99GB | High quality, very usable. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q5_0.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q5_0.gguf) | Q5_0 | 4.99GB | High quality, older format, generally not recommended. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf) | Q4_K_M | 4.36GB | Good quality, similar to 4.25 bpw. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q4_K_S.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q4_K_S.gguf) | Q4_K_S | 4.14GB | Slightly lower quality with small space savings. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q4_0.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q4_0.gguf) | Q4_0 | 4.10GB | Decent quality, older format, generally not recommended. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q3_K_L.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q3_K_L.gguf) | Q3_K_L | 3.82GB | Lower quality but usable, good for low RAM availability. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q3_K_M.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q3_K_M.gguf) | Q3_K_M | 3.51GB | Even lower quality. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q3_K_S.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q3_K_S.gguf) | Q3_K_S | 3.16GB | Low quality, not recommended. |
+| [LiberatedHermes-2-Pro-Mistral-7B-Q2_K.gguf](https://huggingface.co/bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF/blob/main/LiberatedHermes-2-Pro-Mistral-7B-Q2_K.gguf) | Q2_K | 2.71GB | Extremely low quality, *not* recommended. |
+
+Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
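The README asks readers to download a single file rather than the whole branch. A sketch of one way to do that with the Hugging Face CLI (assumes `huggingface_hub` is installed; the Q4_K_M file is just an example pick from the table):

```shell
# Install the CLI if needed.
pip install -U "huggingface_hub[cli]"

# Fetch exactly one quant file from the repo into the current directory.
huggingface-cli download bartowski/LiberatedHermes-2-Pro-Mistral-7B-GGUF \
  LiberatedHermes-2-Pro-Mistral-7B-Q4_K_M.gguf \
  --local-dir .
```

This avoids pulling all twelve GGUF files (roughly 54 GB combined) when only one quantization level is wanted.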