---
base_model: ozayezerceli/Selocan-2x7B-v1
inference: false
library_name: transformers
license: apache-2.0
merged_models:
- TURKCELL/Turkcell-LLM-7b-v1
- NovusResearch/Novus-7b-tr_v1
pipeline_tag: text-generation
quantized_by: Suparious
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- TURKCELL/Turkcell-LLM-7b-v1
- NovusResearch/Novus-7b-tr_v1
---

# ozayezerceli/Selocan-2x7B-v1 AWQ
- Model creator: [ozayezerceli](https://huggingface.co/ozayezerceli)
- Original model: [Selocan-2x7B-v1](https://huggingface.co/ozayezerceli/Selocan-2x7B-v1)
## Model Summary
Selocan-2x7B-v1 is a Mixture of Experts (MoE) model built from the following models using LazyMergekit: