Update README.md
- biology
- medical
---

# Democratizing Medical LLMs For Many More Languages

Covering 12 major languages (English, Chinese, French, Hindi, Spanish, Arabic, Russian, Japanese, Korean, German, Italian, Portuguese) and 38 minor languages so far.

<center>

<p align="center">
   📃 <a href="https://arxiv.org/abs/2410.10626" target="_blank">Paper</a> • 🌐 <a href="" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEDataset" target="_blank">ApolloMoEDataset</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEBench" target="_blank">ApolloMoEBench</a> • 🤗 <a href="https://huggingface.co/collections/FreedomIntelligence/apollomoe-and-apollo2-670ddebe3bb1ba1aebabbf2c" target="_blank">Models</a> • 🌐 <a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Apollo</a>
</p>

![Apollo](assets/apollo_medium_final.png)

## 🌈 Update

* **[2024.10.15]** The ApolloMoE repo is published! 🎉
## Languages Coverage

12 Major Languages and 38 Minor Languages

<details>
<summary>Click to view the Languages Coverage</summary>

![ApolloMoE](assets/languages.png)

</details>
## Architecture

<details>
### Dense

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-0.5B" target="_blank">Apollo2-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-1.5B" target="_blank">Apollo2-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-2B" target="_blank">Apollo2-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-3.8B" target="_blank">Apollo2-3.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-7B" target="_blank">Apollo2-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo2-9B" target="_blank">Apollo2-9B</a>

<details>
<summary>Click to view the Dense Models Results</summary>

![ApolloMoE](assets/dense_results.png)

</details>

### Post-MoE

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-0.5B" target="_blank">Apollo-MoE-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-1.5B" target="_blank">Apollo-MoE-1.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MoE-7B" target="_blank">Apollo-MoE-7B</a>

<details>
<summary>Click to view the Post-MoE Models Results</summary>

![ApolloMoE](assets/post_moe_results.png)

</details>
## Usage Format

#### Apollo2

#### Apollo-MoE

- 0.5B, 1.5B, 7B: User:{query}\nAssistant:{response}<|endoftext|>
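
The prompt template above can be applied directly with `transformers`. The following is a minimal sketch (not a script from this repo): it assumes the checkpoint loads through the standard `AutoModelForCausalLM`/`AutoTokenizer` API, and `trust_remote_code=True` plus the greedy decoding settings are illustration-only assumptions.

```python
# Minimal sketch: query Apollo-MoE-0.5B with the format "User:{query}\nAssistant:{response}<|endoftext|>".
# Assumptions (not stated in this README): the model loads via AutoModelForCausalLM
# with trust_remote_code=True; adjust generation settings to your needs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/Apollo-MoE-0.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

query = "What are the common side effects of metformin?"
prompt = f"User:{query}\nAssistant:"  # the model is expected to stop at <|endoftext|>

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```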
## Dataset & Evaluation

- Dataset

  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEDataset" target="_blank">ApolloMoEDataset</a>

  </details>
- Evaluation

  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloMoEBench" target="_blank">ApolloMoEBench</a>

  <details><summary>Click to expand</summary>

  - EN:
    - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
    - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)

  - PT: [BioInstructQA](https://huggingface.co/datasets/BioMistral/BioInstructQA): Portuguese part
  - RU: [RuMedBench](https://github.com/sb-ai-lab/MedBench)

  </details>
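
Both resources can be pulled straight from the Hugging Face Hub for a quick look. This is a rough sketch using the `datasets` library; the README does not document config or split names, so the defaults below are assumptions and `load_dataset` may ask for an explicit config.

```python
# Rough sketch: inspect the ApolloMoE training corpus and benchmark from the Hub.
# Assumption (not stated in this README): the default configs are loadable as-is;
# if load_dataset() reports that a config name is required, pass one explicitly.
from datasets import load_dataset

corpus = load_dataset("FreedomIntelligence/ApolloMoEDataset")
bench = load_dataset("FreedomIntelligence/ApolloMoEBench")

print(corpus)  # available splits, sizes, and column names
print(bench)
```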
## Results reproduction

<details><summary>Click to expand</summary>

We take Apollo2-7B or Apollo-MoE-0.5B as an example.

1. Download the dataset for the project:

   ```
   bash 0.download_data.sh
   ```
2. Prepare test and dev data for the specific model:

   - Create test data with the model-specific special tokens

   ```
   bash 1.data_process_test&dev.sh
   ```

3. Prepare training data for the specific model (create tokenized data in advance):

   - You can adjust the training data order and the number of training epochs in this step

   ```
   bash 2.data_process_train.sh
   ```
4. Train the model:

   - If you want to train on multiple nodes, please refer to ./src/sft/training_config/zero_multi.yaml

   ```
   bash 3.single_node_train.sh
   ```

5. Evaluate the model:

   ```
   bash 4.eval.sh
   ```
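
After training, it can be handy to chat with a checkpoint directly before running the full benchmark. A minimal sketch along these lines should work, assuming the checkpoint directory is in standard Hugging Face format; the path below is a placeholder.

```python
# Minimal sketch: interactive sanity check of a trained checkpoint.
# Assumptions (not from this README): the checkpoint is saved in Hugging Face format
# and uses the Apollo-MoE prompt format from the "Usage Format" section above.
from transformers import AutoModelForCausalLM, AutoTokenizer

ckpt = "./ckpts/your/path"  # placeholder: point this at your trained checkpoint
tokenizer = AutoTokenizer.from_pretrained(ckpt, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(ckpt, trust_remote_code=True)

while True:
    query = input("User: ")
    inputs = tokenizer(f"User:{query}\nAssistant:", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
    print("Assistant:", reply)
```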
</details>