Update README.md
README.md CHANGED
@@ -24,21 +24,22 @@ This model should cover multiple different disciplines and behaviors well as I t
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
+ - **Developed by:** [ibivibiv](https://huggingface.co/ibivibiv)
+ - **Funded by:** [ibivibiv](https://huggingface.co/ibivibiv) <-- right out of my poor pocket, lol
+ - **Model type:** MOE (mixture of experts)
+ - **Language(s) (NLP):** English
+ - **License:** Apache 2.0
+ - **Finetuned from model:** see the Model Sources section below for the list of models used in the MOE
+
+ ### Model Sources
+
+ I use the following four models to create an MOE that should cover multiple disciplines and cover each of them well. If I can afford it, I will most likely try this out, and if it works I will make another variation.
+
+ [Marcoroni-70B-v1](https://huggingface.co/AIDC-ai-business/Marcoroni-70B-v1)
+ [Aurora-Nights-70B-v1.0](https://huggingface.co/sophosympatheia/Aurora-Nights-70B-v1.0)
+ [strix-rufipes-70b](https://huggingface.co/ibivibiv/strix-rufipes-70b) <-- this one is mine :) I'm a bit proud, sorry.
+ [ICBU-NPU/FashionGPT-70B-V1.1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.1)
## Uses
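The card does not yet include a usage snippet, so here is a minimal, hedged sketch of how the finished MOE described above might be loaded and prompted with 🤗 transformers once it is published on the Hub. The repository id, precision, and generation settings are illustrative assumptions, not details taken from this commit.

```python
# Hedged sketch: loading and prompting the finished MOE with 🤗 transformers.
# The repo id below is a placeholder -- the final model name is not stated in
# this README change, so substitute the real Hub id once it is published.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ibivibiv/your-moe-model-here"  # hypothetical repo id, not confirmed by the card

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # a 4x70B MOE is large; half precision (or quantization) is assumed
    device_map="auto",          # spread layers across available GPUs/CPU (requires accelerate)
)

prompt = "Explain the difference between a dense 70B model and a mixture-of-experts model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampled generation with a modest temperature; tune these values for your use case.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the assumptions hold, this is the standard transformers loading path; only the repository id and the hardware-dependent settings should need changing.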