jomangbp committed
Commit 050395c
1 Parent(s): 24c7278

Update README.md

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -17,11 +17,12 @@ base_model:
 
 seldonium-2x7b-MoE-v0.1-coder-logic is a Mixture of Experts (MoE) model that combines the capabilities of two specialized language models:
 
- [Locutusque/Hercules-4.0-Mistral-v0.2-7B] (https://huggingface.co/Locutusque/Hercules-4.0-Mistral-v0.2-7B?not-for-all-audiences=true): A 7B parameter model focused on programming tasks, such as writing functions, implementing algorithms, and working with data structures.
+ Locutusque/Hercules-4.0-Mistral-v0.2-7B: A 7B parameter model focused on programming tasks, such as writing functions, implementing algorithms, and working with data structures.
 
- [Open-Orca/Mistral-7B-OpenOrca] (https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca): A 7B parameter model focused on logical reasoning and analysis, including solving logic problems, evaluating arguments, and assessing the validity of statements.
+ Open-Orca/Mistral-7B-OpenOrca: A 7B parameter model focused on logical reasoning and analysis, including solving logic problems, evaluating arguments, and assessing the validity of statements.
 
- This MoE model was created using the [LazyMergekit] (https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing) colab, which allows for efficient combination of specialized models to produce a more capable and efficient overall model. The seldonium-2x3b-MoE-v0.1 can be used for a variety of natural language processing tasks that benefit from the complementary strengths of its expert components."
+ This MoE model was created using the LazyMergekit colab, which allows for efficient combination of specialized models to produce a more capable and efficient overall model.
+ The seldonium-2x3b-MoE-v0.1 can be used for a variety of natural language processing tasks that benefit from the complementary strengths of its expert components.
 
 
 ## 🧩 Configuration
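
The updated card notes that the merged model can be used for a variety of natural language processing tasks; a minimal 🤗 Transformers usage sketch is shown below. The repository id `jomangbp/seldonium-2x7b-MoE-v0.1-coder-logic` is an assumption inferred from the committer and model name (it is not stated in this diff), and the prompt is purely illustrative.

```python
# Minimal usage sketch (not part of this commit). The repo id below is an
# assumption based on the committer and model name; replace it with the
# actual Hub repository if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jomangbp/seldonium-2x7b-MoE-v0.1-coder-logic"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A prompt that exercises the coding expert; any text prompt works the same way.
prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Models produced with LazyMergekit's MoE merge are typically published as standard causal-LM checkpoints, so loading generally follows this `AutoModelForCausalLM` pattern.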