---
base_model:
- HuggingFaceH4/mistral-7b-grok
- OpenPipe/mistral-ft-optimized-1218
library_name: transformers
tags:
- mergekit
- merge
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# hibana2077/Pioneer-2x7B AWQ

- Model creator: [hibana2077](https://huggingface.co/hibana2077)
- Original model: [Pioneer-2x7B](https://huggingface.co/hibana2077/Pioneer-2x7B)

## Model Summary

Pioneer-2x7B is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit), using the SLERP merge method. This repository provides a 4-bit AWQ quantization of that merged model.

The following models were included in the merge:

* [HuggingFaceH4/mistral-7b-grok](https://huggingface.co/HuggingFaceH4/mistral-7b-grok)
* [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
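SLERP (spherical linear interpolation) blends two weight tensors along the arc between them rather than along a straight line, which preserves the magnitude of the interpolated weights better than plain averaging. A minimal sketch of the underlying formula in NumPy (illustrative only; mergekit's actual implementation handles additional edge cases):

```python
import numpy as np

def slerp(t: float, w0: np.ndarray, w1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors at fraction t."""
    # Angle between the two tensors, treated as flattened vectors.
    v0 = w0.ravel() / (np.linalg.norm(w0) + eps)
    v1 = w1.ravel() / (np.linalg.norm(w1) + eps)
    dot = np.clip(np.dot(v0, v1), -1.0, 1.0)
    omega = np.arccos(dot)

    # Nearly parallel tensors: fall back to ordinary linear interpolation.
    if omega < eps:
        return (1 - t) * w0 + t * w1

    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * w0 + s1 * w1
```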
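The exact merge configuration for Pioneer-2x7B is not reproduced in this card. For reference, a typical mergekit SLERP config for two Mistral-7B models looks like the following; the base model choice, layer ranges, and `t` schedule shown here are assumptions for illustration, not the values actually used:

```yaml
# Hypothetical mergekit SLERP config -- illustrative only.
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1218
        layer_range: [0, 32]
      - model: HuggingFaceH4/mistral-7b-grok
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1218
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```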
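## Usage

Because the weights here are 4-bit AWQ, they can be loaded directly with `transformers`, provided the `autoawq` package is installed as the quantization backend. A minimal inference sketch; the repo id below is a placeholder assumption, so substitute this repository's actual id:

```python
# Assumptions: `autoawq` is installed (pip install autoawq) and a CUDA GPU
# is available. The repo id is a placeholder for this repository's id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/Pioneer-2x7B-AWQ"  # placeholder: substitute the real repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a SLERP model merge is in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a completion; adjust the generation parameters to taste.
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```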