|
--- |
|
language: |
|
- en |
|
- ko |
|
license: llama3 |
|
library_name: transformers |
|
base_model: |
|
- meta-llama/Meta-Llama-3-8B |
|
--- |
|
|
|
<a href="https://github.com/MLP-Lab/Bllossom"> |
|
<img src="https://github.com/teddysum/bllossom/blob/main//bllossom_icon.png?raw=true" width="40%" height="50%"> |
|
</a> |
|
|
|
# Bllossom | [Demo]() | [Homepage](https://www.bllossom.ai/) | [Github](https://github.com/MLP-Lab/Bllossom)
|
|
|
[Colab example code (GPU)](https://colab.research.google.com/drive/1fBOzUVZ6NRKk_ugeoTbAOokWKqSN47IG?usp=sharing) |
[Colab example code for the quantized model (CPU)](https://colab.research.google.com/drive/129ZNVg5R2NPghUEFHKF0BRdxsZxinQcJ?usp=drive_link)
|
|
|
```bash
Our Bllossom team is releasing Bllossom, a Korean-English bilingual language model!
With the support of the Seoultech supercomputing center, all of the model's parameters were fully fine-tuned on more than 100GB of Korean text, making it a Korean-enhanced bilingual model!
Looking for a model that is fluent in Korean?
- A first for Korean: a vocabulary expanded by more than 30,000 Korean tokens
- Handles Korean contexts roughly 25% longer than Llama 3
- Korean-English knowledge linking using a Korean-English parallel corpus (pretraining)
- Fine-tuning on data crafted by linguists with Korean culture and language in mind
- Reinforcement learning

All of the above is applied at once, and the model is free for commercial use. Build your own model with Bllossom!
It can even be trained on a free Colab GPU. Or try the quantized model on CPU: [quantized model](https://huggingface.co/MLP-KTLim/llama-3-Korean-Bllossom-8B-4bit)

1. Bllossom-8B is a practically oriented language model built jointly by the language-resource labs and linguists of Seoultech, Teddysum, and Yonsei University! We will maintain it with continuous updates, so please make good use of it.
2. We also have the more powerful Advanced-Bllossom 8B and 70B models, as well as a vision-language model! (Contact us individually if you are curious!!)
3. Bllossom was accepted for presentation at NAACL 2024 and at LREC-COLING 2024 (oral).
4. We will keep releasing better language models!! Anyone who wants to collaborate on enhancing Korean (especially on papers) is always welcome!!
   In particular, if you can lend even a small amount of GPU time, contact us any time! We will help you build what you want.
```
|
|
|
Bllossom is a Korean-English bilingual language model based on the open-source Llama 3. It strengthens the connection between Korean and English knowledge and has the following features:
|
|
|
* **Knowledge Linking**: Linking Korean and English knowledge through additional training

* **Vocabulary Expansion**: Expanding the Korean vocabulary to improve Korean expressiveness

* **Instruction Tuning**: Tuning with custom-made instruction-following data specialized for the Korean language and culture

* **Human Feedback**: Direct Preference Optimization (DPO) applied

* **Vision-Language Alignment**: Aligning a vision transformer with this language model
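The **Human Feedback** item above refers to DPO, which trains on preference pairs rather than a reward model. As a rough sketch of the data such a step consumes (a generic preference-pair layout, not the Bllossom team's actual dataset schema):

```python
# DPO (Direct Preference Optimization) fine-tunes on preference pairs: for
# each prompt, a response annotators preferred and one they rejected.
# This record layout is a generic illustration, not the team's actual schema.
preference_example = {
    "prompt": "Explain Hangul in one sentence.",
    "chosen": "Hangul is the Korean featural alphabet of 14 basic consonant and 10 basic vowel letters.",
    "rejected": "I am not sure.",
}

# A DPO trainer increases the model's likelihood margin of "chosen" over
# "rejected" for the same prompt, with no separate reward model.
print(sorted(preference_example))  # -> ['chosen', 'prompt', 'rejected']
```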
|
|
|
**This model developed by [MLPLab at Seoultech](http://mlp.seoultech.ac.kr), [Teddysum](http://teddysum.ai/) and [Yonsei Univ](https://sites.google.com/view/hansaemkim/hansaem-kim)** |
|
|
|
## Demo Video |
|
|
|
<div style="display: flex; justify-content: space-between;"> |
|
  <!-- first column -->
|
<div style="width: 49%;"> |
|
<a> |
|
<img src="https://github.com/lhsstn/lhsstn/blob/main/x-llava_dem.gif?raw=true" style="width: 100%; height: auto;"> |
|
</a> |
|
<p style="text-align: center;">Bllossom-V Demo</p> |
|
</div> |
|
|
|
  <!-- second column (if needed) -->
|
<div style="width: 49%;"> |
|
<a> |
|
<img src="https://github.com/lhsstn/lhsstn/blob/main/bllossom_demo_kakao.gif?raw=true" style="width: 70%; height: auto;"> |
|
</a> |
|
    <p style="text-align: center;">Bllossom Demo (Kakao)</p>
|
</div> |
|
</div> |
|
|
|
|
|
|
|
## NEWS |
|
* [2024.05.08] Vocabulary expansion model update.

* [2024.04.25] We released Bllossom v2.0, based on Llama 3.

* [2023.12] We released Bllossom-Vision v1.0, based on Bllossom.

* [2023.08] We released Bllossom v1.0, based on Llama 2.

* [2023.07] We released Bllossom v0.7, based on polyglot-ko.
|
|
|
|
|
## Example code |
|
|
|
### Colab Tutorial |
|
- [Inference-Code-Link](https://colab.research.google.com/drive/1fBOzUVZ6NRKk_ugeoTbAOokWKqSN47IG?usp=sharing) |
|
|
|
### Install Dependencies |
|
```bash |
|
pip install torch transformers==4.40.0 accelerate |
|
``` |
|
|
|
### Python code with Pipeline |
|
```python |
|
import transformers |
|
import torch |
|
|
|
model_id = "MLP-KTLim/llama-3-Korean-Bllossom-8B" |
|
|
|
pipeline = transformers.pipeline( |
|
"text-generation", |
|
model=model_id, |
|
model_kwargs={"torch_dtype": torch.bfloat16}, |
|
device_map="auto", |
|
) |
|
|
|
pipeline.model.eval() |
|
|
|
PROMPT = '''You are a helpful AI assistant. Please answer the user's questions kindly. 당신은 유능한 AI 어시스턴트입니다. 사용자의 질문에 대해 친절하게 답변해주세요.'''
|
instruction = "서울과학기술대학교 MLP연구실에 대해 소개해줘"
|
|
|
messages = [ |
|
{"role": "system", "content": f"{PROMPT}"}, |
|
{"role": "user", "content": f"{instruction}"} |
|
] |
|
|
|
prompt = pipeline.tokenizer.apply_chat_template( |
|
messages, |
|
tokenize=False, |
|
add_generation_prompt=True |
|
) |
|
|
|
terminators = [ |
|
pipeline.tokenizer.eos_token_id, |
|
pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>") |
|
] |
|
|
|
outputs = pipeline( |
|
prompt, |
|
max_new_tokens=2048, |
|
eos_token_id=terminators, |
|
do_sample=True, |
|
temperature=0.6, |
|
top_p=0.9 |
|
) |
|
|
|
print(outputs[0]["generated_text"][len(prompt):]) |
|
|
|
# Example output (translated): "The MLP Lab at Seoul National University of
# Science and Technology conducts multimodal natural language processing
# research. Its members include Professor KyungTae Lim and students Minjun Kim,
# Sangmin Kim, Changsu Choi, Inho Won, Hangyeol Yoo, Hyeonseok Lim,
# Seungwoo Song, Jeonghun Yuk, and Dongjae Shin."
|
``` |
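The `apply_chat_template` call above renders the messages list into the model's chat format. As a rough illustration of the rendered string (the authoritative template lives in the model's tokenizer config; the format below is assumed from the Llama 3 base model, with a hypothetical helper name):

```python
def render_llama3_chat(messages):
    """Render a messages list in the (assumed) Llama 3 chat format."""
    prompt = "<|begin_of_text|>"
    for m in messages:
        # each turn: role header, blank line, content, end-of-turn marker
        prompt += f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n{m['content']}<|eot_id|>"
    # add_generation_prompt=True appends an open assistant header so the
    # model continues with its reply from there
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "Introduce the MLP lab."},
]
print(render_llama3_chat(messages))
```

This also shows why `<|eot_id|>` appears in `terminators`: it is the token that closes each turn, so generation should stop when the model emits it.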
|
|
|
### Python code with AutoModel |
|
```python |
|
|
|
import os |
|
import torch |
|
from transformers import AutoTokenizer, AutoModelForCausalLM |
|
|
|
model_id = 'MLP-KTLim/llama-3-Korean-Bllossom-8B' |
|
|
|
tokenizer = AutoTokenizer.from_pretrained(model_id) |
|
model = AutoModelForCausalLM.from_pretrained( |
|
model_id, |
|
torch_dtype=torch.bfloat16, |
|
device_map="auto", |
|
) |
|
|
|
model.eval() |
|
|
|
PROMPT = '''You are a helpful AI assistant. Please answer the user's questions kindly. 당신은 유능한 AI 어시스턴트입니다. 사용자의 질문에 대해 친절하게 답변해주세요.'''
|
instruction = "서울과학기술대학교 MLP연구실에 대해 소개해줘"
|
|
|
messages = [ |
|
{"role": "system", "content": f"{PROMPT}"}, |
|
{"role": "user", "content": f"{instruction}"} |
|
] |
|
|
|
input_ids = tokenizer.apply_chat_template( |
|
messages, |
|
add_generation_prompt=True, |
|
return_tensors="pt" |
|
).to(model.device) |
|
|
|
terminators = [ |
|
tokenizer.eos_token_id, |
|
tokenizer.convert_tokens_to_ids("<|eot_id|>") |
|
] |
|
|
|
outputs = model.generate( |
|
input_ids, |
|
max_new_tokens=2048, |
|
eos_token_id=terminators, |
|
do_sample=True, |
|
temperature=0.6, |
|
top_p=0.9 |
|
) |
|
|
|
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)) |
|
# Example output (translated): "The MLP Lab at Seoul National University of
# Science and Technology conducts multimodal natural language processing
# research. Its members include Professor KyungTae Lim and students Minjun Kim,
# Sangmin Kim, Changsu Choi, Inho Won, Hangyeol Yoo, Hyeonseok Lim,
# Seungwoo Song, Jeonghun Yuk, and Dongjae Shin."
|
``` |
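A note on the final `decode` call: `generate` returns the prompt ids followed by the newly generated ids in a single sequence, which is why the output is sliced with `input_ids.shape[-1]:` before decoding. The same bookkeeping with plain Python lists (the token ids are made up for illustration):

```python
# generate() hands back prompt tokens + new tokens in one sequence;
# dropping the first len(prompt_ids) entries leaves only the reply.
prompt_ids = [128000, 9906, 1917]         # hypothetical prompt token ids
generated = prompt_ids + [40, 1097, 264]  # what generate() would hand back

reply_ids = generated[len(prompt_ids):]   # mirrors outputs[0][input_ids.shape[-1]:]
print(reply_ids)  # -> [40, 1097, 264]
```

Skipping this slice would decode the prompt back out along with the answer.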
|
|
|
|
|
|
|
## Citation |
|
**Language Model** |
|
```text |
|
@misc{bllossom,

  author = {ChangSu Choi and Yongbin Jeong and Seoyoon Park and InHo Won and HyeonSeok Lim and SangMin Kim and Yejee Kang and Chanhyuk Yoon and Jaewan Park and Yiseul Lee and HyeJin Lee and Younggyun Hahm and Hansaem Kim and KyungTae Lim},

  title = {Optimizing Language Augmentation for Multilingual Large Language Models: A Case Study on Korean},

  year = {2024},

  journal = {LREC-COLING 2024},

  paperLink = {\url{https://arxiv.org/pdf/2403.10882}},

}
|
``` |
|
|
|
**Vision-Language Model** |
|
```text |
|
@misc{bllossom-V,

  author = {Dongjae Shin and Hyunseok Lim and Inho Won and Changsu Choi and Minjun Kim and Seungwoo Song and Hangyeol Yoo and Sangmin Kim and Kyungtae Lim},

  title = {X-LLaVA: Optimizing Bilingual Large Vision-Language Alignment},

  year = {2024},

  publisher = {GitHub},

  journal = {NAACL 2024 findings},

  paperLink = {\url{https://arxiv.org/pdf/2403.11399}},

}
|
``` |
|
|
|
## Contact |
|
- 임경태(KyungTae Lim), Professor at Seoultech. `[email protected]`

- 함영균(Younggyun Hahm), CEO of Teddysum. `[email protected]`

- 김한샘(Hansaem Kim), Professor at Yonsei. `[email protected]`
|
|
|
## Contributors

- 최창수(Changsu Choi), [email protected]

- 김상민(Sangmin Kim), [email protected]

- 원인호(Inho Won), [email protected]

- 김민준(Minjun Kim), [email protected]

- 송승우(Seungwoo Song), [email protected]

- 신동재(Dongjae Shin), [email protected]

- 임현석(Hyeonseok Lim), [email protected]

- 육정훈(Jeonghun Yuk), [email protected]

- 유한결(Hangyeol Yoo), [email protected]

- 송서현(Seohyun Song), [email protected]