jomangbp committed on
Commit
24c7278
1 Parent(s): 51a0cbd

Update README.md

Files changed (1)
  1. README.md +8 -4
README.md CHANGED
@@ -13,11 +13,15 @@ base_model:
 - Open-Orca/Mistral-7B-OpenOrca
 ---
 
-# seldonium-2x3b-MoE-v0.1
+# seldonium-2x7b-MoE-v0.1
 
-seldonium-2x3b-MoE-v0.1 is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
-* [Locutusque/Hercules-4.0-Mistral-v0.2-7B](https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B)
-* [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
+seldonium-2x7b-MoE-v0.1 is a Mixture of Experts (MoE) model that combines the capabilities of two specialized language models:
+
+* [Locutusque/Hercules-4.0-Mistral-v0.2-7B](https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B?not-for-all-audiences=true): a 7B-parameter model focused on programming tasks such as writing functions, implementing algorithms, and working with data structures.
+
+* [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca): a 7B-parameter model focused on logical reasoning and analysis, including solving logic problems, evaluating arguments, and assessing the validity of statements.
+
+This MoE model was created with the [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing) Colab notebook, which makes it straightforward to combine specialized models into a single, more capable model. seldonium-2x7b-MoE-v0.1 can be used for a variety of natural language processing tasks that benefit from the complementary strengths of its expert components.
 
 ## 🧩 Configuration
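For reference, a minimal usage sketch for the merged model. The repository id `jomangbp/seldonium-2x7b-MoE-v0.1` is an assumption inferred from the author and model name (it is not confirmed by this diff), and the code uses a standard `transformers` causal-LM setup:

```python
# Minimal usage sketch; the repo id below is hypothetical, not confirmed by this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jomangbp/seldonium-2x7b-MoE-v0.1"  # assumed repository id

# Load tokenizer and model; device_map="auto" requires the accelerate package.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example prompt playing to the coder expert.
prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the merge follows the usual LazyMergekit MoE recipe, the resulting checkpoint loads like any ordinary causal language model, with expert routing handled inside the model itself.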