---
language:
- en
- zh
license: llama3
library_name: transformers
base_model: unsloth/llama-3-8b-bnb-4bit
datasets:
- erhwenkuo/alpaca-data-gpt4-chinese-zhtw
pipeline_tag: text-generation
tags:
- llama-3
prompt_template: >-
  {{ if .System }}<|start_header_id|>system<|end_header_id|> {{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|> {{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|> {{ .Response }}<|eot_id|>
---
LLAMA 3 8B capable of outputting Traditional Chinese
✨ LMStudio is recommended for running this model
I tried running it with Ollama, but the output became quite delulu, so for now I'm sticking with LMStudio :)

The performance isn't actually that great, but it can answer some basic questions. Sometimes it just acts really dumb though :(
LLAMA 3.1 can actually output Chinese pretty well, so this repo can be ignored.
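If you'd rather test the model with `transformers` instead of LMStudio, here is a minimal sketch. It assumes the weights are published under a placeholder repo id (`your-username/llama-3-8b-zhtw`, replace it with this repo's actual id) and simply rebuilds the Llama 3 prompt format from the `prompt_template` above by hand; it is not an official usage snippet for this repo.

```python
# Minimal sketch: load the model with transformers and build a prompt that
# follows the Llama 3 template shown in the metadata above.
# NOTE: "your-username/llama-3-8b-zhtw" is a placeholder repo id, not the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/llama-3-8b-zhtw"  # placeholder, replace with this repo's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

def build_prompt(user_msg: str, system_msg: str | None = None) -> str:
    """Reproduce the template: optional system turn, user turn,
    then an open assistant header for the model to complete.
    The tokenizer adds <|begin_of_text|> itself, so it is not included here."""
    prompt = ""
    if system_msg:
        prompt += f"<|start_header_id|>system<|end_header_id|>\n\n{system_msg}<|eot_id|>"
    prompt += f"<|start_header_id|>user<|end_header_id|>\n\n{user_msg}<|eot_id|>"
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

inputs = tokenizer(build_prompt("用繁體中文介紹一下台北。"), return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    eos_token_id=tokenizer.convert_tokens_to_ids("<|eot_id|>"),  # stop at end-of-turn
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

LMStudio and Ollama apply this chat template for you from the packaged model files; the manual formatting above is only useful for checking the model's behaviour directly in Python.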