---
license: apache-2.0
language:
- en
---
# TinyMix-8x1b
This model is a Mixture-of-Experts (MoE) model consisting of 8 copies of [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) as experts.
This model is untrained and will likely perform worse than the dense version. Training will start very soon.
Credit for the idea goes to eastwind, who applied it to the [chat version of the model](https://huggingface.co/eastwind/tinymix-8x1b-chat).
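
Even untrained, the checkpoint can be loaded and sampled like any causal LM. Below is a minimal inference sketch with 🤗 Transformers; the `model_id` is a placeholder assumption, so substitute this repo's actual Hugging Face id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tinymix-8x1b"  # assumption: replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 8 x 1.1B experts; half precision keeps memory manageable
    device_map="auto",
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```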