YC-Chen committed on
Commit
f436e2b
1 Parent(s): b625de9

Update README.md

Files changed (1)
  1. README.md +9 -1
README.md CHANGED
@@ -7,7 +7,15 @@ pipeline_tag: text-generation
  The Breeze-7B-Instruct-v0.1 is a 7-billion-parameter language model built from Mistral-7B and tailored for Traditional Chinese (TC).
  This model incorporates an additional 30k TC vocabularies to better adapt to TC and improve inference speed, resulting in a doubling of the original tokenizer's inference speed.
  Breeze-7B-Instruct-v0.1 performs well on both EN and TC benchmarks.
- This model outperforms Taiwan-LLM-7B-v2.1-chat, Taiwan-LLM-13B-v2.0-chat, and Yi-6B-Chat on all TC benchmarks we tested, and is comparable with Mistral-7B-Instruct on the Open LLM Leaderboard.
+ This model outperforms Taiwan-LLM-7B-v2.1-chat, Taiwan-LLM-13B-v2.0-chat, and Yi-6B-Chat on major TC benchmarks we tested, and is comparable with Mistral-7B-Instruct on the Open LLM Leaderboard.
+
+ ## Features
+
+ - Expanding the vocabulary dictionary for Traditional Chinese from 32k to 62k vocabulary size (the first successful work in Traditional Chinese)
+ - Multi-turn dialogue without special handling for harmful content
+ - 8k context length
+ - Grouped-query attention
+ - Sliding-window attention
 
  ## Model Details
  - **Finetuned from:** [MediaTek-Research/Breeze-7B-Base-v0.1](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v0.1)
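The vocabulary expansion described in the diff (32k to 62k entries) speeds inference because dedicated Traditional Chinese tokens cover more text per token, so each sequence needs fewer decoding steps. A minimal toy sketch of that effect, using a greedy longest-match tokenizer over a handful of invented entries (this is not the actual Breeze or Mistral BPE tokenizer):

```python
# Illustrative sketch (not the real Breeze tokenizer): adding dedicated
# Traditional Chinese tokens shortens the token sequence for the same text.

def tokenize(text, vocab):
    """Greedy longest-match tokenization over a toy vocabulary."""
    tokens, i = [], 0
    while i < len(text):
        # Try the longest vocabulary entry that matches at position i.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in vocab:
                tokens.append(piece)
                i += length
                break
        else:
            tokens.append(text[i])  # fall back to a single character
            i += 1
    return tokens

base_vocab = {"語", "言", "模", "型"}             # character-level entries only
extended_vocab = base_vocab | {"語言", "模型"}    # added multi-character TC tokens

text = "語言模型"  # "language model"
print(len(tokenize(text, base_vocab)))      # 4 tokens with the base vocabulary
print(len(tokenize(text, extended_vocab)))  # 2 tokens after the extension
```

Halving the token count per sentence roughly halves the number of forward passes at generation time, which is the mechanism behind the "doubling of the original tokenizer's inference speed" claim.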
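The new Features list also names grouped-query attention and sliding-window attention, both inherited from the Mistral-7B architecture. A minimal NumPy sketch of the two mechanisms combined; the head counts, window size, and tensor shapes below are illustrative toy values, not Breeze-7B's actual configuration:

```python
import numpy as np

# Toy sketch of grouped-query attention with a causal sliding-window mask.
# Shapes and head counts are illustrative, not Breeze-7B's real config.

def gqa_sliding_attention(q, k, v, window):
    # q: (n_heads, seq, d); k, v: (n_kv_heads, seq, d) with n_kv_heads < n_heads
    n_heads, seq, d = q.shape
    n_kv_heads = k.shape[0]
    group = n_heads // n_kv_heads
    # Grouped-query attention: each group of query heads shares one K/V head,
    # shrinking the KV cache by a factor of `group`.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Causal sliding-window mask: position i attends to [i - window + 1, i].
    i = np.arange(seq)[:, None]
    j = np.arange(seq)[None, :]
    mask = (j <= i) & (j > i - window)
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.normal(size=(8, 6, 4))  # 8 query heads
k = rng.normal(size=(2, 6, 4))  # only 2 shared K/V heads (grouped-query)
v = rng.normal(size=(2, 6, 4))
out = gqa_sliding_attention(q, k, v, window=3)
print(out.shape)  # (8, 6, 4)
```

The windowed mask keeps per-token attention cost constant regardless of total sequence length, which is what makes the 8k context practical.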