---
base_model: LeroyDyer/Mixtral_AI_MiniTron_Chat
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# LeroyDyer/Mixtral_AI_MiniTron_Chat AWQ

- Model creator: [LeroyDyer](https://huggingface.co/LeroyDyer)
- Original model: [Mixtral_AI_MiniTron_Chat](https://huggingface.co/LeroyDyer/Mixtral_AI_MiniTron_Chat)
## Model Summary
These small models are easy to train for a task. They already come with some training (not great), but they can take more and more, and being Mistral-based they can take LoRA modules.

Remember to add training on top of any LoRA you merge with the model: load the LoRA and train it for a few cycles (e.g. 20 steps) on the same data that was used to build it, check that the training took hold, then merge it. A sketch of this workflow follows below.
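A minimal sketch of that "top up, then merge" workflow using the `peft` library. This is an assumption-laden illustration, not the creator's exact procedure: the adapter id is a hypothetical placeholder, and the actual training loop (Trainer/SFTTrainer setup and data) is omitted.

```python
# Hedged sketch: top up a LoRA with a few extra steps, then merge it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "LeroyDyer/Mixtral_AI_MiniTron_Chat"
adapter_id = "your-username/your-lora-adapter"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)

# Load the adapter in trainable mode so it can take a few more steps.
model = PeftModel.from_pretrained(model, adapter_id, is_trainable=True)

# ... train ~20 steps here on the same data the LoRA was built from,
#     e.g. with transformers.Trainer or trl's SFTTrainer (omitted) ...

# Once the extra training has taken hold, bake the adapter into the base.
merged = model.merge_and_unload()
merged.save_pretrained("Mixtral_AI_MiniTron_Chat-merged")
tokenizer.save_pretrained("Mixtral_AI_MiniTron_Chat-merged")
```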
- **Developed by:** LeroyDyer
- **License:** apache-2.0
- **Finetuned from model:** LeroyDyer/Mixtral_AI_MiniTron
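Since this repository is tagged as a 4-bit AWQ quantization for text generation, here is a minimal inference sketch with `transformers`. It is a sketch under assumptions, not an official snippet: it presumes the `autoawq` package is installed and that `repo_id` is replaced with this repository's actual id.

```python
# Hedged example: loading an AWQ-quantized model with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<this-awq-repo-id>"  # assumption: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

prompt = "Explain what AWQ quantization does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```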