---
base_model: unsloth/Qwen2-1.5B-Instruct-bnb-4bit
language:
- jv
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
---
# Bakpia

*Open models for indigenous Indonesian languages*

Bakpia is a family of open language models capable of responding in the Javanese language. Version 1 of Bakpia is the first generative Javanese LLM to achieve functional instruction-following performance using solely synthetic data.
**Beta preview**
Bakpia V1 is a fine-tuned version of Qwen 2 1.5B Instruct. It is fine-tuned on a large synthetic dataset of Krama Javanese, in which the prompts were generated by GPT-4o and the responses by Claude 3 Haiku.
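
Below is a minimal inference sketch using Hugging Face Transformers. The repository id and the sample Krama prompt are illustrative placeholders, not values confirmed by this card; substitute the actual Bakpia V1 checkpoint name when it is available.

```python
# Minimal inference sketch; repo id below is a placeholder, not the official checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Bakpia-V1-1.5B"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Qwen 2 models ship with a chat template, so the prompt is formatted through it.
messages = [
    {"role": "user", "content": "Punapa ingkang dipunwastani bakpia?"}  # example Krama question
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```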
## Version 1.0
This is the first version of Bakpia.
### ✨ Training
- 36K input-output pairs
- LoRA rank/alpha of 64/128
- Rank-stabilized LoRA (rsLoRA); a reproduction sketch follows below
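
The sketch below shows how a fine-tune of this kind might be reproduced with Unsloth and TRL, which the card's tags suggest were used. The LoRA rank/alpha and rsLoRA settings come from the list above; the dataset path, sequence length, and optimizer hyperparameters are assumptions for illustration, not the exact training recipe.

```python
# Reproduction sketch under assumed hyperparameters; only r=64, alpha=128, and rsLoRA
# are taken from the card. Dataset file and training arguments are hypothetical.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2-1.5B-Instruct-bnb-4bit",  # base model from this card
    max_seq_length=2048,
    load_in_4bit=True,
)

model = FastLanguageModel.get_peft_model(
    model,
    r=64,                # LoRA rank (from the card)
    lora_alpha=128,      # LoRA alpha (from the card)
    use_rslora=True,     # rank-stabilized LoRA (from the card)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Hypothetical file holding the 36K chat-templated input-output pairs in a "text" column.
dataset = load_dataset("json", data_files="krama_pairs.json", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,   # assumed
        gradient_accumulation_steps=4,   # assumed
        num_train_epochs=1,              # assumed
        learning_rate=2e-4,              # assumed
        output_dir="outputs",
    ),
)
trainer.train()
```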
### ✨ Features
- Single-turn QA across various domains.
- Ngoko Javanese is not currently supported.
## Uploaded model
- **Developed by:** Afrizal Hasbi Azizy
- **License:** apache-2.0