---
license: cc-by-nc-4.0
base_model: spow12/Ko-Qwen2-7B-Instruct
tags:
- gguf
model-index:
- name: joongi007/Ko-Qwen2-7B-Instruct-GGUF
  results: []
---
- The original model is spow12/Ko-Qwen2-7B-Instruct.
- Quantized with llama.cpp (release b3510).
Prompt template (ChatML):

<|im_start|>system
{System}<|im_end|>
<|im_start|>user
{User}<|im_end|>
<|im_start|>assistant
{Assistant}
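For reference, a minimal sketch of how a prompt string matching the template above could be built; the function name and placeholder texts below are illustrative and not part of the original card.

```python
# Minimal sketch: fill the ChatML template above with a system and user message.
# The trailing "<|im_start|>assistant\n" leaves the turn open for the model to complete.
def build_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt(
    "You are a helpful Korean-speaking assistant.",
    "안녕하세요, 자기소개 부탁드려요.",
)
print(prompt)
```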
"Flash Attention" function must be activated. why?