mradermacher committed
Commit a6c3fe7
1 Parent(s): b4088bc

auto-patch README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -2,6 +2,7 @@
 base_model: nitky/Superswallow-70b-NVE
 language:
 - en
+- ja
 library_name: transformers
 license: llama2
 model_type: llama
@@ -43,7 +44,6 @@ more details, including on how to concatenate multi-part files.
 | [PART 1](https://huggingface.co/mradermacher/Superswallow-70b-NVE-GGUF/resolve/main/Superswallow-70b-NVE.Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Superswallow-70b-NVE-GGUF/resolve/main/Superswallow-70b-NVE.Q6_K.gguf.part2of2) | Q6_K | 57.0 | very good quality |
 | [PART 1](https://huggingface.co/mradermacher/Superswallow-70b-NVE-GGUF/resolve/main/Superswallow-70b-NVE.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Superswallow-70b-NVE-GGUF/resolve/main/Superswallow-70b-NVE.Q8_0.gguf.part2of2) | Q8_0 | 73.6 | fast, best quality |
 
-
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
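The hunk context above refers to concatenating multi-part files: the Q6_K and Q8_0 quants are split into `.part1of2`/`.part2of2` pieces because they exceed upload size limits, and they are plain byte-level splits that can be reassembled with `cat`. A minimal sketch, demonstrated on dummy files (substitute the real `.partNofM` filenames from the table; see the linked README for the author's exact instructions):

```shell
# Multi-part GGUF files are simple byte splits; reassembly is concatenation
# of the parts in order. Shown here with small dummy files for illustration.
cd "$(mktemp -d)"
printf 'part-one-' > model.gguf.part1of2
printf 'part-two'  > model.gguf.part2of2

# Concatenate in part order to recover the original single file.
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

cat model.gguf
```

For the real files this would be, e.g., `cat Superswallow-70b-NVE.Q6_K.gguf.part1of2 Superswallow-70b-NVE.Q6_K.gguf.part2of2 > Superswallow-70b-NVE.Q6_K.gguf`.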