llama.cpp fix: 'no' is a reserved YAML keyword and so must be put in quotation marks
#1 opened by nicoboss
For this model to be compatible with llama.cpp, the metadata inside the README.md must follow valid YAML syntax. According to https://yaml.org/type/bool.html, an unquoted "no" is parsed as a boolean and must therefore be put in quotation marks.
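As a minimal sketch of the behaviour, assuming the metadata is read with a YAML 1.1 parser such as PyYAML (the field name and shortened language list below are illustrative, not copied from this README):

```python
import yaml  # PyYAML follows YAML 1.1, where unquoted no/yes/on/off resolve to booleans

# Unquoted: the Norwegian language code "no" becomes the boolean False
print(yaml.safe_load("language: [ar, sa, no, gn]")["language"])
# -> ['ar', 'sa', False, 'gn']

# Quoted: "no" stays a plain string, so the list contains only strings
print(yaml.safe_load("language: [ar, sa, 'no', gn]")["language"])
# -> ['ar', 'sa', 'no', 'gn']
```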
If this is not done, llama.cpp interprets the language list like this:
general.languages = ['ar', ..., 'sa', False, 'gn', ..., 'ha']
It then crashes with the following error:
INFO:hf-to-gguf:Set model quantization version
INFO:gguf.gguf_writer:Writing the following files:
INFO:gguf.gguf_writer:Apollo2-9B.SOURCE.gguf: n_tensors = 464, total_size = 18.5G
general.architecture = gemma2
general.type = model
general.name = Apollo2 9B
general.basename = Apollo2
general.size_label = 9B
general.license = apache-2.0
general.base_model.count = 1
general.base_model.0.name = Gemma 2 9b
general.base_model.0.organization = Google
general.base_model.0.repo_url = https://huggingface.co/google/gemma-2-9b
general.tags = ['biology', 'medical', 'question-answering']
general.languages = ['ar', 'en', 'zh', 'ko', 'ja', 'mn', 'th', 'vi', 'lo', 'mg', 'de', 'pt', 'es', 'fr', 'ru', 'it', 'hr', 'gl', 'cs', 'co', 'la', 'uk', 'bs', 'bg', 'eo', 'sq', 'da', 'sa', False, 'gn', 'sr', 'sk', 'gd', 'lb', 'hi', 'ku', 'mt', 'he', 'ln', 'bm', 'sw', 'ig', 'rw', 'ha']
Traceback (most recent call last):
  File "/root/llama.cpp/convert_hf_to_gguf.py", line 4430, in <module>
    main()
  File "/root/llama.cpp/convert_hf_to_gguf.py", line 4424, in main
    model_instance.write()
  File "/root/llama.cpp/convert_hf_to_gguf.py", line 436, in write
    self.gguf_writer.write_kv_data_to_file()
  File "/root/llama.cpp/gguf-py/gguf/gguf_writer.py", line 241, in write_kv_data_to_file
    kv_bytes += self._pack_val(val.value, val.type, add_vtype=True)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/llama.cpp/gguf-py/gguf/gguf_writer.py", line 885, in _pack_val
    raise ValueError("All items in a GGUF array should be of the same type")
ValueError: All items in a GGUF array should be of the same type
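The root cause is that a GGUF array must hold elements of a single type, so the mixed string/boolean language list cannot be packed. A rough sketch of that constraint, assuming nothing about the real gguf-py internals beyond the error message:

```python
# Illustrative only, not the actual gguf-py implementation: GGUF arrays are
# serialized with one shared element type, so mixed-type lists are rejected.
def pack_array(items: list) -> None:
    element_types = {type(v) for v in items}
    if len(element_types) != 1:
        raise ValueError("All items in a GGUF array should be of the same type")
    # ... pack each item using the single shared element type ...

pack_array(['ar', 'sa', False, 'gn'])  # raises ValueError, matching the traceback above
```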
Please read https://huggingface.co/mradermacher/model_requests/discussions/370 for more information.
Thanks a lot
Xidong changed pull request status to merged