Afrizal Hasbi Azizy committed
Commit: ffa6522
Parent(s): eb078da
Update README.md

README.md CHANGED
language:
- id
inference: false
---
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Document Title</title>
  <style>
    h1 {
      font-size: 36px;
      color: navy;
      font-family: 'Tahoma';
      text-align: center;
    }
  </style>
</head>
<body>
  <h1>Introducing the Kancil family of open models</h1>
</body>
</html>

<center>
<img src="https://imgur.com/9nG5J1T.png" alt="Kancil" width="600" height="300">
<p><em>Kancil is a fine-tuned version of Llama 3 8B, trained on a synthetic QA dataset generated with Llama 3 70B. Version zero of Kancil is the first generative Indonesian LLM to gain functional instruction performance using solely synthetic data.</em></p>
<p><em><a href="https://colab.research.google.com/drive/1526QJYfk32X1CqYKX7IA_FFcIHLXbOkx?usp=sharing" style="color: blue;">Go straight to the colab demo</a></em></p>
</center>
39 |
|
|
|
|
|
40 |
Selamat datang!
|
41 |
|
42 |
I am ultra-overjoyed to introduce you... the 🦌 Kancil! It's a fine-tuned version of Llama 3 8B with the TumpengQA, an instruction dataset of 6.7 million words. Both the model and dataset is openly available in Huggingface.
|
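Since the model is published on the Hugging Face Hub, it can be loaded with the `transformers` library. The sketch below is a minimal, unofficial example: the repo id is a placeholder (check the model page for the real one), and the import is done lazily so the snippet can be read without `transformers` installed.

```python
# Minimal sketch of loading the model from the Hugging Face Hub.
MODEL_ID = "your-org/kancil-v0"  # placeholder repo id (assumption, not the official one)

def load_kancil(model_id: str = MODEL_ID):
    """Download the tokenizer and causal-LM weights from the Hub."""
    # Imported lazily so this sketch can be inspected without
    # `transformers` being installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_kancil()
    inputs = tokenizer("Selamat datang!", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For an end-to-end walkthrough, the Colab demo linked above is the quickest way to try the model without a local GPU.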