Multi-language?
#7
by oFDz · opened
How hard/time consuming would it be to fine-tune this one to be able to handle a language like Arabic?
Please take a look at TinyLlama: https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0
The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, this can be achieved within a span of "just" 90 days using 16 A100-40G GPUs. Training started on 2023-09-01.
Of course, if you have enough compute, it would be faster.
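As a rough sense of what those figures imply, here is a back-of-envelope calculation (a sketch using only the numbers quoted above: 3 trillion tokens, 90 days, 16 A100-40G GPUs) of the per-GPU throughput the TinyLlama schedule assumes:

```python
# Back-of-envelope check of the TinyLlama figures quoted above:
# 3 trillion tokens in 90 days on 16 A100-40G GPUs.
TOTAL_TOKENS = 3e12
DAYS = 90
GPUS = 16

gpu_days = DAYS * GPUS                         # 1,440 GPU-days total
tokens_per_gpu_day = TOTAL_TOKENS / gpu_days   # ~2.08e9 tokens per GPU-day
tokens_per_gpu_sec = tokens_per_gpu_day / 86_400

print(f"{tokens_per_gpu_sec:,.0f} tokens/s per GPU")
```

This works out to roughly 24,000 tokens/s per GPU, which is why a smaller cluster stretches the schedule proportionally: halving the GPU count doubles the calendar time at the same per-GPU throughput. Fine-tuning for Arabic would need far fewer tokens than this full pretraining run, though the base model's tokenizer coverage of Arabic script is worth checking first.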