---
library_name: transformers
license: apache-2.0
datasets:
- bigscience-data/roots_zh-tw_wikipedia
- bigscience-data/roots_en_wikipedia
language:
- zh
---
# Model Card for Chinese-OpenELM-270M
Finetuned from [apple/OpenELM-270M](https://huggingface.co/apple/OpenELM-270M):
* Extended the tokenizer with ~30K Chinese tokens trained on [bigscience-data/roots_zh-tw_wikipedia](https://huggingface.co/datasets/bigscience-data/roots_zh-tw_wikipedia).
* Continual pre-trained with a mix of [bigscience-data/roots_zh-tw_wikipedia](https://huggingface.co/datasets/bigscience-data/roots_zh-tw_wikipedia) and [bigscience-data/roots_en_wikipedia](https://huggingface.co/datasets/bigscience-data/roots_en_wikipedia).
* Evaluation perplexity = 1.664 (3% of the training data was held out as the evaluation set).
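
As a reminder of what the evaluation number means, perplexity is the exponential of the mean per-token negative log-likelihood on the held-out set. The sketch below is illustrative only (the `token_log_probs` values are made-up examples, not from this model's evaluation):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp of the mean negative log-likelihood per token.

    token_log_probs: natural-log probabilities the model assigned to
    each token of the held-out text.
    """
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# Hypothetical example: three tokens with probabilities 0.5, 0.8, 0.6
logps = [math.log(0.5), math.log(0.8), math.log(0.6)]
print(round(perplexity(logps), 3))  # → 1.609
```

A lower value means the model is, on average, less "surprised" by the held-out text; a perplexity of 1.664 corresponds to a mean per-token cross-entropy of about 0.51 nats.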