---
base_model: unsloth/Qwen2-1.5B-Instruct-bnb-4bit
language:
- jv
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
---
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Bakpia</title>
    <style>
        h1 {
            font-size: 36px;
            color: navy;
            font-family: 'Tahoma';
            text-align: center;
        }
    </style>
</head>
<body>
    <h1>Open models for indigenous Indonesian languages</h1>
</body>
</html>

<center>
    <img src="https://imgur.com/R91sZas.png" alt="Bakpia" width="600" height="300">
    <p><em>Bakpia is a family of open language models capable of responding in the Javanese language. Version 1 of Bakpia is the first generative Javanese LLM to reach functional instruction-following performance using solely synthetic data.</em></p>
    <p><em style="color: black; font-weight: bold;">Beta preview</em></p>
</center>
Bakpia V1 is a fine-tuned version of Qwen 2 1.5B Instruct. It is fine-tuned on a large synthetic dataset of Krama Javanese, where the prompts were generated by GPT-4o and the responses by Claude 3 Haiku.

# Version 1.0

This is the first version of Bakpia.

✨ Training
- 36K input-output pairs
- LoRA rank 64, alpha 128
- Rank-stabilized LoRA (rsLoRA)
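
Rank-stabilized LoRA replaces the standard LoRA scaling factor α/r with α/√r, which keeps the magnitude of the low-rank update stable as the rank grows. A minimal sketch of the two scalings at this run's settings (r = 64, α = 128):

```python
import math

# Standard LoRA scales the low-rank update BA by alpha / r;
# rank-stabilized LoRA (rsLoRA) scales it by alpha / sqrt(r).
def lora_scaling(alpha: float, r: int) -> float:
    return alpha / r

def rslora_scaling(alpha: float, r: int) -> float:
    return alpha / math.sqrt(r)

alpha, r = 128, 64  # values used for this fine-tune
print(lora_scaling(alpha, r))    # 2.0
print(rslora_scaling(alpha, r))  # 16.0
```

At rank 64, the standard scaling would shrink the adapter's contribution to 2.0, while the rank-stabilized variant keeps it at 16.0.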

✨ Features
- Single-turn QA across various domains.
- Ngoko Javanese is not currently supported.
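
Since the model is single-turn, each query is sent as one user message. Assuming the fine-tune inherits Qwen 2 Instruct's ChatML chat template (in practice `tokenizer.apply_chat_template` builds this for you), a single-turn prompt has this layout; the greeting below is only an illustration:

```python
# Manually build a single-turn ChatML prompt in the layout used by
# Qwen 2 Instruct models: system, user, then an open assistant turn.
def build_prompt(user_message: str,
                 system_message: str = "You are a helpful assistant.") -> str:
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Krama Javanese: "Good morning! How are you?"
prompt = build_prompt("Sugeng enjing! Kados pundi kabaripun?")
print(prompt)
```

The model's reply is whatever it generates after the final `<|im_start|>assistant` marker.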

# Uploaded model

- **Developed by:** Afrizal Hasbi Azizy
- **License:** apache-2.0