Update README.md
README.md CHANGED

@@ -1,5 +1,8 @@
 ---
 license: mit
+pipeline_tag: image-text-to-text
+tags:
+- text-generation-inference
 ---
 
 <h2 align="center"> <a href="https://arxiv.org/abs/2405.14297">Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models</a></h2>
@@ -8,6 +11,7 @@ license: mit
 
 ## 📰 News
 
+- **[2024.5.31]** 🔥 Our [code](https://github.com/LINs-lab/DynMoE/) is released!
 - **[2024.05.25]** 🔥 Our **checkpoints** are available now!
 - **[2024.05.23]** 🔥 Our [paper](https://arxiv.org/abs/2405.14297) is released!
 
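Applying this change, the model card's YAML front matter ends up as the following (reconstructed directly from the diff above; the keys follow the Hugging Face model-card metadata schema):

```yaml
---
license: mit
pipeline_tag: image-text-to-text
tags:
- text-generation-inference
---
```

Setting `pipeline_tag` controls which task widget and filters the Hub associates with the model, and `tags` makes it discoverable under the listed keywords.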