apepkuss79 committed on
Commit 3576064
1 Parent(s): 202df49

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +3 -0
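
The commit message states the file was pushed with the `huggingface_hub` Python client. Below is a minimal sketch of how such an upload could be performed; the exact invocation used for this commit is not shown on the page, and the authentication setup is an assumption.

```python
# Minimal sketch: pushing a README to a model repo with huggingface_hub.
# Assumes you are already authenticated (e.g. `huggingface-cli login` or HF_TOKEN);
# the precise call used to produce this commit is not shown here.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",                    # local file to push
    path_in_repo="README.md",                       # destination path in the repo
    repo_id="second-state/Qwen1.5-110B-Chat-GGUF",  # target model repo
    commit_message="Upload README.md with huggingface_hub",
)
```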
README.md CHANGED
@@ -75,6 +75,9 @@ tags:
75   | [Qwen1.5-110B-Chat-Q2_K.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q2_K.gguf) | Q2_K | 2 | 41.2 GB | smallest, significant quality loss - not recommended for most purposes |
76   | [Qwen1.5-110B-Chat-Q3_K_M-00001-of-00002.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q3_K_M-00001-of-00002.gguf) | Q3_K_M | 3 | 32.2 GB | very small, high quality loss |
77   | [Qwen1.5-110B-Chat-Q3_K_M-00002-of-00002.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q3_K_M-00002-of-00002.gguf) | Q3_K_M | 3 | 21.5 GB | very small, high quality loss |
78 + | [Qwen1.5-110B-Chat-Q4_K_M-00001-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q4_K_M-00001-of-00003.gguf) | Q4_K_M | 4 | 32 GB | medium, balanced quality - recommended |
79 + | [Qwen1.5-110B-Chat-Q4_K_M-00002-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q4_K_M-00002-of-00003.gguf) | Q4_K_M | 4 | 32.1 GB | medium, balanced quality - recommended |
80 + | [Qwen1.5-110B-Chat-Q4_K_M-00003-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q4_K_M-00003-of-00003.gguf) | Q4_K_M | 4 | 3.09 GB | medium, balanced quality - recommended |
81   | [Qwen1.5-110B-Chat-Q5_K_M-00001-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q5_K_M-00001-of-00003.gguf) | Q5_K_M | 5 | 32.1 GB | large, very low quality loss - recommended |
82   | [Qwen1.5-110B-Chat-Q5_K_M-00002-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q5_K_M-00002-of-00003.gguf) | Q5_K_M | 5 | 32 GB | large, very low quality loss - recommended |
83   | [Qwen1.5-110B-Chat-Q5_K_M-00003-of-00003.gguf](https://huggingface.co/second-state/Qwen1.5-110B-Chat-GGUF/blob/main/Qwen1.5-110B-Chat-Q5_K_M-00003-of-00003.gguf) | Q5_K_M | 5 | 14.8 GB | large, very low quality loss - recommended |
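
The Q4_K_M quant added in this commit is split into three shards that need to sit in the same local directory. A minimal download sketch with `huggingface_hub` follows; the `local_dir` name is an assumption, and whether the shards load directly from the first part or need merging depends on how they were split.

```python
# Minimal sketch: fetching the three Q4_K_M shards listed in the table above.
# Keep all "-0000N-of-00003" parts in one directory; llama.cpp-style loaders
# are usually pointed at the first shard (merge first if the files were split
# another way). The "models" directory name is an assumption.
from huggingface_hub import hf_hub_download

repo_id = "second-state/Qwen1.5-110B-Chat-GGUF"
shards = [
    "Qwen1.5-110B-Chat-Q4_K_M-00001-of-00003.gguf",
    "Qwen1.5-110B-Chat-Q4_K_M-00002-of-00003.gguf",
    "Qwen1.5-110B-Chat-Q4_K_M-00003-of-00003.gguf",
]
for name in shards:
    path = hf_hub_download(repo_id=repo_id, filename=name, local_dir="models")
    print("downloaded:", path)
```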