---
license: apache-2.0
---
## Introduction
This repository provides quantizations of [UnicomLLM/Unichat-llama3-Chinese-8B](https://huggingface.co/UnicomLLM/Unichat-llama3-Chinese-8B) to f16, q2, q3, q4, q5, q6, and q8 formats, produced with llama.cpp.
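
For reference, a minimal sketch of the usual llama.cpp quantization workflow (local paths, output file names, and the exact set of quant types are assumptions; the conversion script and the `llama-quantize` tool ship with llama.cpp, though their names have varied across versions):

```python
import subprocess
from pathlib import Path

# Assumed local paths (not part of this repository's build scripts).
MODEL_DIR = Path("Unichat-llama3-Chinese-8B")        # local Hugging Face checkout
F16_GGUF = Path("Unichat-llama3-Chinese-8B-f16.gguf")

# 1. Convert the Hugging Face checkpoint to an f16 GGUF file.
subprocess.run(
    ["python", "convert_hf_to_gguf.py", str(MODEL_DIR),
     "--outfile", str(F16_GGUF), "--outtype", "f16"],
    check=True,
)

# 2. Quantize the f16 GGUF into lower-bit variants (example quant types).
for qtype in ["Q2_K", "Q3_K_M", "Q4_K_M", "Q5_K_M", "Q6_K", "Q8_0"]:
    out = F16_GGUF.with_name(f"Unichat-llama3-Chinese-8B-{qtype}.gguf")
    subprocess.run(["./llama-quantize", str(F16_GGUF), str(out), qtype], check=True)
```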
## Prompt template
```
{system_message}
Human: {prompt}
Assistant:
``` |
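
A minimal sketch of running one of the quantized files with the `llama-cpp-python` bindings using the template above (the GGUF file name, system message, and generation settings are assumed examples):

```python
from llama_cpp import Llama

# Load a quantized GGUF file (file name is an assumed example).
llm = Llama(model_path="Unichat-llama3-Chinese-8B-Q4_K_M.gguf", n_ctx=4096)

system_message = "You are a helpful Chinese-language assistant."  # assumed example
prompt = "请介绍一下北京。"  # "Please introduce Beijing."

# Fill in the prompt template shown above.
full_prompt = f"{system_message}\nHuman: {prompt}\nAssistant:"

# Stop at the next "Human:" turn so the model answers a single turn.
output = llm(full_prompt, max_tokens=256, stop=["Human:"])
print(output["choices"][0]["text"])
```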