---
license: apache-2.0
base_model: xaviviro/FLAMA-0.1-3B
language:
- ca
- en
model_creator: xaviviro
model_name: FLAMA-0.1-3B
prompt_template: '<|im_start|>user\n{instruction}<|im_end|>\n<|im_start|>assistant\n'
quantized_by: xaviviro
---
# FLAMA: A 3B ChatML Model in Catalan. Version 0.1
FLAMA is the first small 3B model in Catalan. It is the result of fine-tuning the [open_llama_3b_v2](/openlm-research/open_llama_3b_v2) model on the [OpenAssistant](/xaviviro/oasst1_ca) instructions, automatically translated into Catalan with [Helsinki-NLP](/Helsinki-NLP) resources and formatted as ChatML.
# Prompt Template
FLAMA uses ChatML as its prompt template:
```
<|im_start|>user
Qui va ser Isaac Newton?<|im_end|>
<|im_start|>assistant
```
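As a sketch, the template from the front matter can be filled in with plain Python string formatting. The `build_prompt` helper below is illustrative only, not part of the model's tooling:

```python
# ChatML template for FLAMA, matching the `prompt_template` field in the
# model card's front matter.
PROMPT_TEMPLATE = "<|im_start|>user\n{instruction}<|im_end|>\n<|im_start|>assistant\n"

def build_prompt(instruction: str) -> str:
    """Wrap a single user instruction in FLAMA's ChatML prompt format."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Qui va ser Isaac Newton?")
print(prompt)
```

The resulting string ends right after the `<|im_start|>assistant` turn marker, so the model continues generating the assistant's reply from that point.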
## References
```
@software{xaviviro2023flama,
  author = {xaviviro},
  title  = {FLAMA: A 3B ChatML Model in Catalan. Version 0.1},
  month  = dec,
  year   = 2023,
  url    = {https://huggingface.co/xaviviro/FLAMA-0.1-3B}
}
```
```
@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title  = {OpenLLaMA: An Open Reproduction of LLaMA},
  month  = may,
  year   = 2023,
  url    = {https://github.com/openlm-research/open_llama}
}
```
```
@software{together2023redpajama,
  author = {Together Computer},
  title  = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA Training Dataset},
  month  = apr,
  year   = 2023,
  url    = {https://github.com/togethercomputer/RedPajama-Data}
}
```
```
@article{touvron2023llama,
  title   = {LLaMA: Open and Efficient Foundation Language Models},
  author  = {Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
  journal = {arXiv preprint arXiv:2302.13971},
  year    = {2023}
}
```