Update README.md
## Key Messages

1. JetMoE-8B is **trained with less than $0.1 million**<sup>1</sup> **in cost but outperforms LLaMA2-7B from Meta AI**, which has multi-billion-dollar training resources. LLM training can be **much cheaper than people previously thought**.

2. JetMoE-8B is **fully open-sourced and academia-friendly** because:
   - It **only uses public datasets** for training, and the code is open-sourced. No proprietary resources are needed.