---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
- code
- medical
- farmer
- doctor
- Mega-Series
- Cyber-Series
- Role-Play
- Self-Rag
- ThinkingBot
- milestone
- mega-series
- SpydazWebAI
base_model: LeroyDyer/Mixtral_AI_CyberTron_Ultra
pipeline_tag: text-generation
inference: false
metrics:
- accuracy
- bertscore
- bleu
- brier_score
- cer
- character
- charcut_mt
- chrf
- code_eval
datasets:
- gretelai/synthetic_text_to_sql
- HuggingFaceTB/cosmopedia
- teknium/OpenHermes-2.5
- Open-Orca/SlimOrca
- Open-Orca/OpenOrca
- cognitivecomputations/dolphin-coder
- databricks/databricks-dolly-15k
- yahma/alpaca-cleaned
- uonlp/CulturaX
- mwitiderrick/SwahiliPlatypus
- swahili
- Rogendo/English-Swahili-Sentence-Pairs
- ise-uiuc/Magicoder-Evol-Instruct-110K
- meta-math/MetaMathQA
quantized_by: Suparious
---
# LeroyDyer/Mixtral_AI_CyberTron_Ultra AWQ
- Model creator: [LeroyDyer](https://huggingface.co/LeroyDyer)
- Original model: [Mixtral_AI_CyberTron_Ultra](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Ultra)
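## How to use

The quantized weights can be loaded through `transformers` (recent versions ship AWQ support via the `autoawq` backend). The snippet below is a minimal sketch, not the canonical usage: the AWQ repo id `solidrust/Mixtral_AI_CyberTron_Ultra-AWQ` is an assumption (substitute the actual hosting repo), a CUDA-capable device is assumed, and the Mistral `[INST]` prompt template is built by hand for clarity (the tokenizer's `apply_chat_template` would produce the equivalent string from a chat-message list).

```python
def build_prompt(user_message: str) -> str:
    # Hand-rolled Mistral instruction template; matches the shape
    # tokenizer.apply_chat_template would emit for a single user turn.
    return f"<s>[INST] {user_message} [/INST]"

if __name__ == "__main__":
    # transformers imported here so the prompt helper above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Repo id is an assumption; replace with the actual AWQ repository name.
    model_id = "solidrust/Mixtral_AI_CyberTron_Ultra-AWQ"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(
        build_prompt("Write a SQL query that counts rows per customer."),
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For high-throughput serving, AWQ checkpoints can also be loaded by inference engines with AWQ support (e.g. vLLM or text-generation-inference) by pointing them at the same repo id.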
## Model Summary
What does it NOT know? That is the question!
### Motto for the model
Models are the same as LoRAs: take them lightly, for they are like tablets of knowledge. Is there really a difference between a model and a LoRA? Only the mega merges make a true difference; the small merges just apply an adapter, so the knowledge is in there somewhere.

It is a great model, my favourite go-to brain, and it will be fine-tuned even further (as cloud credits allow).