# 2x7B AWQ

A collection of mixture-of-experts 2x7B models (20 items).
This is a merge of pre-trained language models created using mergekit.
This model was merged using the SLERP merge method.
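SLERP (spherical linear interpolation) blends two parents' weights along the great-circle arc between them rather than along a straight line, which preserves the magnitude and direction structure of the parent weight vectors better than plain averaging. mergekit's actual implementation adds per-tensor handling and interpolation schedules; the following is only a minimal NumPy sketch of the core interpolation, where the function name `slerp`, the `eps` threshold, and the toy example are illustrative assumptions, not mergekit's API.

```python
import numpy as np

def slerp(t: float, w0: np.ndarray, w1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors of the same shape.

    t=0 returns w0, t=1 returns w1; intermediate t follows the great-circle
    arc between the two (flattened) weight vectors.
    """
    # Flatten so the angle is computed over the whole parameter tensor.
    v0, v1 = w0.ravel(), w1.ravel()
    # Cosine of the angle between the normalized weight vectors.
    cos_omega = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + eps)
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    sin_omega = np.sin(omega)
    # Nearly parallel vectors: fall back to plain linear interpolation
    # to avoid dividing by sin(omega) ~ 0.
    if sin_omega < eps:
        return (1.0 - t) * w0 + t * w1
    # Standard SLERP weights, applied to the original (unflattened) tensors.
    return (np.sin((1.0 - t) * omega) / sin_omega) * w0 \
         + (np.sin(t * omega) / sin_omega) * w1

# Toy usage: merge two random "layers" halfway between the parents.
a = np.random.randn(4, 4).astype(np.float32)
b = np.random.randn(4, 4).astype(np.float32)
merged = slerp(0.5, a, b)
```

In a real merge this interpolation would be applied tensor by tensor across both checkpoints, with `t` optionally varying per layer.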
The following models were included in the merge:

* hibana2077/Pioneer-2x7B (base model)