|
---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
- code
- medical
- farmer
- doctor
- Mega-Series
- Cyber-Series
- Role-Play
- Self-Rag
- ThinkingBot
- milestone
- SpydazWebAI
base_model: LeroyDyer/Mixtral_AI_CyberTron_Ultra
pipeline_tag: text-generation
inference: false
metrics:
- accuracy
- bertscore
- bleu
- brier_score
- cer
- character
- charcut_mt
- chrf
- code_eval
datasets:
- gretelai/synthetic_text_to_sql
- HuggingFaceTB/cosmopedia
- teknium/OpenHermes-2.5
- Open-Orca/SlimOrca
- Open-Orca/OpenOrca
- cognitivecomputations/dolphin-coder
- databricks/databricks-dolly-15k
- yahma/alpaca-cleaned
- uonlp/CulturaX
- mwitiderrick/SwahiliPlatypus
- swahili
- Rogendo/English-Swahili-Sentence-Pairs
- ise-uiuc/Magicoder-Evol-Instruct-110K
- meta-math/MetaMathQA
quantized_by: Suparious
---
|
# LeroyDyer/Mixtral_AI_CyberTron_Ultra AWQ |
|
|
|
- Model creator: [LeroyDyer](https://huggingface.co/LeroyDyer) |
|
- Original model: [Mixtral_AI_CyberTron_Ultra](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Ultra) |
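
As a minimal usage sketch for this AWQ build (assuming the checkpoint loads through the standard transformers AWQ integration, which needs `transformers >= 4.35` plus the `autoawq` and `accelerate` packages; the repository id below is illustrative, substitute the actual id of this quantized repo):

```python
# Minimal sketch: loading a 4-bit AWQ checkpoint with transformers.
# Assumes transformers >= 4.35 with autoawq and accelerate installed;
# the repository id is a placeholder for this quantized repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Mixtral_AI_CyberTron_Ultra-AWQ"  # hypothetical id, replace as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the quantized weights on the available GPU(s)
)

prompt = "Write a short SQL query that counts rows in a table."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```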
|
|
|
## Model Summary |
|
|
|
What does he NOT know? That is the question!
|
|
|
### Motto for the Model!

> Models are the same as LoRAs: treat them as lightweight, they are like tablets of knowledge!

Exactly! (Models or LoRAs, is there a difference? Only the mega merges make a true difference; the small merges are just applying an adapter. It's in there somewhere!) A rough sketch of what "applying an adapter" means is given below.
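
To make that point concrete, here is a minimal sketch of the difference between applying an adapter and merging it into the base weights. It assumes a PEFT-style LoRA adapter, and the adapter repository id is purely hypothetical, not a documented part of this model's lineage:

```python
# Sketch: "applying an adapter" (small merge) vs. folding it into the weights.
# Assumes the peft package is installed; the adapter id is hypothetical.
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("LeroyDyer/Mixtral_AI_CyberTron_Ultra")

# Attach a LoRA adapter on top of the frozen base weights.
model = PeftModel.from_pretrained(base, "some-user/some-lora-adapter")

# Fold the adapter deltas into the base weights, yielding a standalone model.
merged = model.merge_and_unload()
```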
|
|
|
### OK, it's a great model! (My favorite go-to brain now! It will be fine-tuned even more, if I get cloud credits.)
|
|
|
|