myeongho-jeong committed
Commit e8b012e
1 Parent(s): 7fd7639
Update README.md
README.md CHANGED
@@ -27,7 +27,9 @@ If you're passionate about the field of Large Language Models and wish to exchan
 This model is a Korean vocabulary-extended version of [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0), specifically fine-tuned on various Korean web-crawled datasets available on HuggingFace. Our approach was to expand the model's understanding of Korean by pre-training the embeddings for new tokens and partially fine-tuning the `lm_head` embeddings for the already existing tokens while preserving the original parameters of the base model.
 
 ### Technical Deep Dive
-
+<p align="left">
+<img src="https://huggingface.co/yanolja/EEVE-Korean-10.8B-v1.0/blob/main/EEVE_figure.png" width="100%"/>
+<p>
 Here’s a glimpse into our technical approach:
 
 ```python
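
The README paragraph in this diff describes the training recipe at a high level: pre-train embeddings for newly added Korean tokens and partially fine-tune the `lm_head` rows while keeping the base model's original parameters frozen. The sketch below illustrates that general pattern with the Hugging Face `transformers` API; the example tokens, the gradient-masking helper, and the choice to unfreeze only the embedding matrices are assumptions for illustration, not the authors' actual training code.

```python
# Illustrative sketch, not the authors' training code: extend a causal LM's
# vocabulary and restrict training to the newly added embedding rows.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("upstage/SOLAR-10.7B-v1.0")
tokenizer = AutoTokenizer.from_pretrained("upstage/SOLAR-10.7B-v1.0")

original_vocab_size = len(tokenizer)

# Hypothetical new Korean tokens; a real vocabulary extension adds far more.
tokenizer.add_tokens(["안녕하세요", "감사합니다"])

# Grow the input embeddings and lm_head to the extended vocabulary size.
model.resize_token_embeddings(len(tokenizer))

# Freeze every parameter, then re-enable gradients only for the embedding
# matrices (input embeddings and lm_head), which are the parts being trained.
for param in model.parameters():
    param.requires_grad = False
model.get_input_embeddings().weight.requires_grad = True
model.get_output_embeddings().weight.requires_grad = True


def keep_original_rows_frozen(embedding):
    """Zero the gradients of rows that existed before the extension, so an
    optimizer step only updates the newly added token embeddings."""
    if embedding.weight.grad is not None:
        embedding.weight.grad[:original_vocab_size] = 0


# Inside a training step, one might call, after loss.backward():
#     keep_original_rows_frozen(model.get_input_embeddings())
#     optimizer.step()
```

Zeroing the gradient rows of pre-existing tokens after `backward()` is one simple way to confine updates to the new vocabulary entries while leaving the rest of the optimizer setup unchanged.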